Google is talking more openly about companies that use its cloud business, and revealing more about its computing resources, perhaps the largest on the planet. These include disclosures about Google's ultrafast fiber network, its big data resources and the computers and software it has built for itself.
The aim is to position Google as a company capable of handling the biggest and toughest computational exercises, lightning fast. The disclosures follow earlier moves by Google Cloud Platform, as the search company's cloud computing business is called, to show off its data analysis capabilities.
Details like the ability to pass information between Europe and the United States in less than 100 milliseconds, and a practice of fully backing up user data in nine different locations, make Google seem both cutting-edge and even bigger than most people suspected. But the company may also be borrowing from the playbook of Amazon Web Services, which in 2013 started disclosing some eye-opening metrics about its global computing network.
At an event Tuesday for Google Cloud Platform - Google's name for the computing, storage and networking it sells to businesses - Google will name the Taiwanese phone maker HTC as a customer. HTC has used Google's cloud to build a new kind of computing architecture that lets smartphone apps push data updates quickly and reliably to many devices at once, and stay responsive even when the phones get poor reception.
On Wednesday, a senior Google executive is expected to give what the company says will be an unprecedented look at the overall Google network design. This includes key tools that enable large-scale management of computing devices across the globe, according to one customer.
"We are managing 2 million to 3 million smartphones in this network," said John Song, the senior director of engineering at HTC. "Google is the only player in cloud that owns lots of fiber-optic cable worldwide, and it replicates its users' data in nine different places at all times." That kind of control, he said, enables users to do more technically difficult things.
Song said HTC also looked at the cloud offerings of AWS and Microsoft Azure, along with IBM and Alibaba. Google was the dark horse, because it does not operate in China, and HTC wanted to be everywhere in the world. Google's technical dedication won the day.
"The other salesmen just wanted to take orders," he said.
That may not be a winning approach for a mass of customers, but it does suggest how each cloud player is reflecting the nature of its core business.
Amazon, a retailer, is offering computing at scale and ease of use in data analysis. Microsoft, with decades of business ties, stresses its interoperability with current systems and data tools. And IBM has lots of high-level data analysts.
"Each of the cloud companies wants to store and process all of a company's data," said Sharmila Mulligan, co-founder and chief executive of ClearStory Data, a corporate data analysis company, who described all of the companies as "extremely partner-friendly" with companies like hers, owing to their urgency to offer more big data services.
"Data analysis presents a stickiness - once you've put your corporate data in one of them and you're analyzing it, you won't move out," she said. "You'll be constantly ingesting new data - how could you afford to move? If they own your data, they own you."
Nobody crunches data like Google, and the new portrait of its resources is intended to show that handling almost any corporate task will be easy for it. Google Cloud Platform has built out specialties in areas like manufacturing, genomics and media to handle industry-specific needs on a global basis.
"We have 90 points of presence around the world," said Dan Powers, a Google Cloud sales executive, referring to the places where customers can connect to Google's cloud.
A point of presence, or PoP, is where a company's computers get direct access to the Internet and a local telecommunications service provider.
Leah Bibbo, a spokeswoman for AWS, said her company currently had 53 PoPs worldwide. Also on Tuesday, AWS announced that it would offer data analysis using Spark, a technology for handling very large data loads that is rapidly overtaking existing methods.
Powers also said Google runs 2 billion containers, standardized units for packaging and running software, across its global network every week: "We hope to run billions for customers."
© 2015 New York Times News Service