It’s not a surprise that businesses such as Box, Issuu, Pinterest, and Netflix are big users of Amazon Web Services (AWS). When bandwidth and mass storage collide, AWS is always going to be a smart option.
Nor should it be a surprise that plenty of once-small start-ups have seen AWS as a way to scale their businesses quickly while conserving scarce start-up capital. Just look at the meteoric growth of Slack and Airbnb to see why reliance on the traditional data centre model was never going to work.
But these are also examples of smart ideas where AWS is simply the means to faster, better, and cheaper execution. The ideas in question would work in the world of traditional data centres, just not so well.
And what’s exciting to us at Matillion is the emergence of a growing number of businesses and use-cases where—as at Matillion—the smart idea itself owes a lot to Amazon’s rich ecosystem of AWS services and capabilities.
You can see this reflected in more and more of Amazon’s own announcements.
Once, these announcements were of businesses or use-cases where at most one or two AWS capabilities were involved: Amazon S3 and Amazon EC2, for instance.
To us, that spells ‘data centre replacement’, with AWS providing a convenient, scalable, and secure cloud-based alternative to landlocked on-premises technology. Sensible, yes. Exciting? Not really.
Far more exciting is the growing number of cases where many more AWS capabilities and services are pieced together, or where AWS services are leveraged by clever open-source applications.
The result? Something wholly new, different, and—yes—genuinely exciting. Something that might never be cost-effective, viable, or even possible in the traditional world, but which is both practical and affordable in the world of AWS.
Maps come to life
One of the hottest prospects is location analytics, a branch of geospatial intelligence. The idea: blend digital mapping with external datasets containing—say—demographic, socio-economic, or environmental data.
Now, location analytics and geospatial intelligence aren’t new. But any business or research organisation wanting to get to grips with them faces high barriers to entry.
The computing requirements are fairly hefty, for instance, especially if the goal is real-time location analytics leveraging the signals of smartphones or Internet of Things devices. Proprietary geospatial intelligence solutions can be costly. And data volumes can be vast, depending on the geography and population size in question.
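To make the core idea concrete, here is a minimal, purely local sketch of what "blending digital mapping with demographic data" means in practice: geotagged events are assigned to their nearest region, and each region carries an external attribute. The region names, coordinates, and income figures are illustrative, not real data, and a production system would use proper boundary polygons rather than nearest centroids.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical region centroids, each with an attached demographic attribute.
regions = {
    "Manchester": {"lat": 53.4808, "lon": -2.2426, "median_income": 28000},
    "London":     {"lat": 51.5074, "lon": -0.1278, "median_income": 39000},
}

def nearest_region(lat, lon):
    """Assign a point to the region whose centroid is closest."""
    return min(regions, key=lambda r: haversine_km(
        lat, lon, regions[r]["lat"], regions[r]["lon"]))

# A geotagged event (e.g. a smartphone signal) enriched with regional data.
event = {"lat": 53.48, "lon": -2.24}
region = nearest_region(event["lat"], event["lon"])
income = regions[region]["median_income"]
```

The point of the sketch is the join itself: once each event carries a region label, any external dataset keyed by region can be layered on top.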
Cloud + open source analytics + Big Data = ?
Fairly obviously, Amazon AWS and open source analytics can have a huge impact on these barriers. The unachievable and the unaffordable suddenly become both achievable and affordable.
And possibly, routine. What might be possible if geospatial intelligence could be run as a routine business-as-usual application? No one really knows.
But collectively, we’re about to find out, as a confluence of Amazon S3, Amazon EMR, Amazon Redshift, and the R- and Python-based open-source distributed analytics environments SparkR and PySpark brings location analytics and geospatial intelligence within the reach of just about any business.
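In a pipeline like the one described, raw geotagged events would typically land in Amazon S3, be processed on EMR with SparkR or PySpark, and the results loaded into Redshift for SQL analysis. As a hedged, local-only sketch of that final aggregation step, the following uses SQLite in place of Redshift; the table names, column names, and figures are all illustrative.

```python
import sqlite3

# SQLite stands in locally for the kind of SQL aggregation a data warehouse
# such as Redshift would run at scale over billions of rows.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (region TEXT, visits INTEGER);        -- processed event counts
    CREATE TABLE demographics (region TEXT, population INTEGER);  -- external dataset
    INSERT INTO events VALUES ('London', 120), ('Manchester', 45), ('London', 80);
    INSERT INTO demographics VALUES ('London', 8900000), ('Manchester', 550000);
""")

# Blend the location-derived counts with the demographic lookup.
rows = conn.execute("""
    SELECT e.region,
           SUM(e.visits) AS total_visits,
           SUM(e.visits) * 1000000.0 / d.population AS visits_per_million
    FROM events e
    JOIN demographics d ON d.region = e.region
    GROUP BY e.region
    ORDER BY total_visits DESC
""").fetchall()
```

Normalising visits by population is exactly the kind of question that is trivial once the two datasets sit side by side in one SQL engine.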
Nor is this mere theory or projection. Just last week, AWS’s Gopal Wunnava showcased just such an application, using Amazon EMR, Amazon S3, and the distributed computing environment SparkR.
Location analytics, we’d stress, is just one example of where cloud computing, and in particular Amazon’s own AWS take on it, is poised to rewrite the art of the possible.
Disruptive new businesses, and quite possibly disruptive new business models, are just around the corner.
And here at Matillion, we’re proud to play our own small part in that.
To find out more about one of AWS’s most popular services, Amazon Redshift, download our free eBook below.