The reason there’s so much hype around cloud computing is the promise that it will reduce infrastructure costs while providing compute and storage capacity on demand. But, of course, moving to the cloud doesn’t guarantee cost savings.

In the latest reminder of that tension, a survey of 1,300 businesses in the United States and United Kingdom released last week by Rackspace found that 66 percent believe cloud computing has reduced their IT costs, while 17 percent said it has not. The remainder had no opinion. Another survey, commissioned by Internap, which runs 12 datacenters throughout the United States, primarily for colocation but also for its cloud computing business, found that among the 65 percent of respondents considering cloud services, 41 percent expect those services to reduce their costs.

This obviously isn’t an apples-to-apples comparison: among other differences, the Rackspace study surveyed businesses already using cloud services, while the Internap survey included respondents not yet running apps in the cloud. But the two surveys offer some interesting data points on the role costs play in determining the value of cloud computing services.

“It used to be debatable whether the cloud was saving money or not, but apparently the businesses we surveyed believe it is saving them money,” said Rackspace CTO John Engates in an interview last week.

But depending on your application, cloud computing can actually cost more, warned Raj Dutt, senior VP of technology at Internap. That’s especially the case for applications that have consistent and predictable compute and storage usage, he explained.

“People move to the cloud for perceived cost savings and what we’re finding is it gets really expensive compared to colocation, [particularly] if you look at the three-year overall total cost of ownership of an application that is pretty constant,” Dutt said.