August 10, 2019

AI and the cloud – a universal panacea?


AI has evolved into a real buzzword these days. But it’s such an emotionally loaded term that there’s now a wide gap between the idea and the reality. This article attempts to explain what AI is already capable of and why cloud technologies have such an important role to play in it.

Just what are we talking about, actually? “Artificial Intelligence” (AI) refers to the basic idea of getting machines to perform “smart” tasks. The subordinate term “Machine Learning” (ML), by contrast, refers to the learning process by which a system acquires a given behavior based on data. In general terms, ML is learning based on patterns, examples and experience.

However, today’s algorithms are still fairly simple. Until now, programming has primarily been about defining rules. With machine learning, it is the program that identifies these rules for us and continually improves them over time on the basis of data.
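To make that distinction concrete, here is a minimal, purely illustrative sketch in Python: a spam rule we write by hand versus the same kind of rule learned from a handful of labeled examples. The tiny dataset and the single feature (exclamation marks per email) are invented for this example.

# Hand-written rule vs. rule learned from data (illustrative toy example).
from sklearn.tree import DecisionTreeClassifier

# Classic programming: a rule we define ourselves.
def is_spam_rule(num_exclamation_marks: int) -> bool:
    return num_exclamation_marks > 3

# Machine learning: the rule is derived from labeled examples.
X = [[0], [1], [2], [5], [7], [9]]   # feature: exclamation marks per email
y = [0, 0, 0, 1, 1, 1]               # label: 0 = not spam, 1 = spam
model = DecisionTreeClassifier().fit(X, y)

print(is_spam_rule(4))               # the rule we wrote by hand
print(model.predict([[4]]))          # the rule the model learned from data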

 

AI is not all high tech and robotics

Artificial Intelligence is often associated with robots that may serve us at some future point in time. This picture is exaggerated by the media – AI often quite simply means that smart algorithms can use data to support processes. We already benefit from the assistance of many cloud-based AI applications: Google Maps automatically shows us the quickest route to our destination, households use intelligent devices to measure and independently adjust their electricity consumption, and the best email spam filters use AI. In business, too, machine learning can perform many practical tasks, from assisting with resource allocation to predicting and responding to future customer needs through to automating tedious and repetitive jobs.


Pretrained AI systems from the cloud

Various technology providers are keen to make artificial intelligence more accessible and offer interfaces, known as APIs, that enable any organization to integrate pretrained AI systems into its own processes. Many APIs are already used for image analysis. Google’s AutoML, for instance, can be used to train an in-house image recognition model by feeding it labeled sample images; the model learns from these examples and can then recognize and identify a wide range of images, such as screws, with relatively little effort.
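As a rough illustration, this is what a call to a pretrained cloud image-analysis API can look like with Google’s Vision client library for Python; a custom model trained with AutoML is queried in a similar way via its own endpoint. The sketch assumes the client library is installed and cloud credentials are configured, and “screw.jpg” is a hypothetical local file.

# Send a local image to a pretrained label-detection model in the cloud.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("screw.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    print(label.description, round(label.score, 2))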

Data are an important part of AI development, and data-driven tasks – that is to say, information-based challenges – are a prerequisite for any AI application. The data, however, do not have to come from within your own company. A large number of digital datasets – texts, satellite images and videos – are made freely available by communities interested in a wide variety of issues, from wine ratings to urban sounds to consumer behavior on Black Friday.
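Getting a first look at such a dataset usually takes only a few lines. The sketch below assumes a public dataset has already been downloaded as a CSV file; “wine_ratings.csv” is a placeholder name, not a specific source.

# Quick first inspection of a freely available dataset (placeholder file name).
import pandas as pd

df = pd.read_csv("wine_ratings.csv")

print(df.shape)                     # number of examples and features
print(df.head())                    # a first look at the data
print(df.describe(include="all"))   # quick statistical overview of the columns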

Many businesses are already using image recognition and identification, and can train these systems for increasingly specialized tasks. With AutoML, Google Cloud provides a system that anyone can use to build and train their own ML model without much prior knowledge.

Two examples: Disney uses AutoML Vision to assign products and product photos rapidly to specified categories in its online store. So a product manager might upload photos of a T-shirt with a Spider-Man design to the website, and the product is immediately tagged with “Spider-Man,” “T-shirt,” “Marvel” and “Superheroes.”

The Zoological Society of London (ZSL) has pledged to protect animal species worldwide. To fulfill this mission effectively, it has installed camera traps across the globe, which allow it to count and record animals in their natural habitat. ZSL has used AutoML Vision to train an ML model to recognize which animals have been photographed, so that they can be classified automatically. The work of many weeks thus becomes the work of a few hours or even a couple of minutes.


Computing power from the cloud

What does all this have to do with the cloud, though? Machine learning, deep learning and artificial intelligence in general require huge amounts of processing power. One simple way to acquire it is to rent high-performance hardware in data centers that users can access via the Internet. The latest technology for delivering significant improvements in processing power for AI systems is the Tensor Processing Unit (TPU). TensorFlow is a key component of many AI and ML systems, and the TPU processor developed by Google is built to run the TensorFlow open-source AI framework at exceptional speeds, enabling AI systems to run between 15 and 30 times faster. This is equivalent to a leap of seven years into the future compared with previous development cycles. We have installed large numbers of so-called pods, to which TPUs can be connected, in our data centers; a single pod delivers 11.5 petaflops of performance. Companies can scarcely maintain this sort of hardware locally, which is why cloud computing plays a key role in the development of AI systems.
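For developers, using this hardware is largely a matter of configuration. The following sketch shows roughly how a TensorFlow 2 model can be placed on cloud TPUs via a distribution strategy; the TPU name “my-tpu” and the tiny model are placeholders.

# Connect to a provisioned cloud TPU and build a model under a TPU strategy.
import tensorflow as tf

resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-tpu")  # placeholder name
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)
strategy = tf.distribute.TPUStrategy(resolver)

# Everything created inside the strategy scope is replicated across the TPU cores.
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")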


Three points are key to the successful use of AI:

  1. It requires high-quality datasets that ML systems can use to recognize patterns. Ideally, ML systems should be trained with data that are representative of the real world.
  2. Good tools and frameworks are essential. Although basic ML algorithms can be described in just a few minutes, they are pretty complicated to implement. It is therefore necessary to have a series of services that do not require ML or programming know-how.
  3. Immense processing capacity is required. The cloud provides precisely this kind of high‑performance hardware that businesses and developers can use for their own ML models. 

These simple steps will ensure you make a successful start with AI in the cloud. 


Christian Sciullo

Christian Sciullo is an economist (lic. oec. HSG) with over 20 years’ experience in the technology sector helping clients to implement new technology and cloud solutions. He joined Google’s Cloud division in 2013. Since January 2018, he has been Google Cloud Country Manager for Switzerland and Austria.
