Amazon Web Services (AWS), the cloud computing arm of e-commerce giant Amazon, has made a number of announcements at this year's AWS re:Invent, spanning cloud services, data management, intelligent automation, new chips and more. The cloud provider's mega event returned to Las Vegas, US, as its first in-person edition since 2019 and the 11th overall; more than 50,000 attended in the city while more than 300,000 joined virtually.
Here is a list of nine key announcements the cloud major has made so far at this year's AWS re:Invent.
#1 A new ML-based data management service
At its re:Invent conference, AWS announced Amazon DataZone, a new data management service that helps enterprises catalogue, discover, share and govern their data. The cloud provider said it is using machine learning (ML) to help businesses build these data catalogues and generate the metadata that makes them searchable.
In his keynote on Tuesday, November 29, AWS CEO Adam Selipsky said the tool will provide users with fine-grained controls to manage and govern this data.
"That’s long been a major problem for enterprises, especially as data has increased phenomenally, ensuring that users have access to the right data, without compromising personally identifiable information," he said, adding that DataZone enables you to "set data free throughout the organization safely by making it easy for data engineers, data scientists, product managers, analysts and other business users to discover, use and collaborate around that data to drive insights for your businesses.”
#2 Aiming at a zero-ETL future
AWS announced that it is helping enterprises move data management toward a future without extract, transform and load (ETL), the process data scientists and engineers have long wrestled with to get data into shape before putting it to work. "That's because you may have data in a number of different places like your application usage data in a database and customer reviews in your data lake. Putting them together has been a significant challenge," explained the CEO.
AWS introduced Aurora zero-ETL integration with Amazon Redshift to give customers using the Aurora database and the Redshift data warehouse the ability to move data without having to perform ETL on it. The cloud provider also announced a similar integration between Amazon Redshift and Apache Spark, the open-source big data processing framework. This would simplify moving data between the two platforms without having to extract, transform and load first, according to the company. The Redshift-Aurora integration is in preview. The Redshift-Apache Spark integration is available now across all regions.
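To make concrete what these integrations aim to eliminate, here is a minimal, illustrative sketch of a manual ETL step of the kind the CEO describes: joining application usage records from a database with customer reviews from a data lake. All data and names here are made up for illustration; this is not an AWS API.

```python
# Minimal illustrative ETL sketch: the manual extract/transform/load work
# that zero-ETL integrations aim to make unnecessary. All data is made up.

# Extract: usage rows from an operational database, reviews from a data lake
usage_rows = [
    {"customer_id": 1, "sessions": 12},
    {"customer_id": 2, "sessions": 3},
]
review_rows = [
    {"customer_id": 1, "rating": 5},
    {"customer_id": 2, "rating": 2},
]

# Transform: join the two sources on customer_id into one analytics-ready shape
reviews_by_customer = {r["customer_id"]: r["rating"] for r in review_rows}
transformed = [
    {
        "customer_id": u["customer_id"],
        "sessions": u["sessions"],
        "rating": reviews_by_customer.get(u["customer_id"]),
    }
    for u in usage_rows
]

# Load: append into the "warehouse" (here just a list standing in for one)
warehouse = []
warehouse.extend(transformed)
print(warehouse)
```

With a zero-ETL integration, this join-and-load plumbing is what the platform handles for you: data written to the operational store becomes queryable from the warehouse without a hand-built pipeline.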
#3 Managed compute for spatial simulations
Until now, if an enterprise wanted to scale up its spatial simulations, it had to balance the accuracy of the simulation against the capacity of its hardware. This made managing simulation infrastructure a complex job. AWS said it is trying to solve the problem with a new managed compute service that helps customers build, operate and run large-scale spatial simulations.
With AWS SimSpace Weaver, as the service is called, the company said that customers can deploy spatial simulations to model dynamic systems with many data points, such as traffic patterns across an entire city, crowd flows in a venue or factory-floor layouts, and then use the simulations to visualise physical spaces, perform immersive training and get insights on different scenarios to make informed decisions.
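The general idea behind distributing a simulation like this is spatial partitioning: the simulated world is divided into regions so that different workers can each own a slice of it. Below is a toy sketch of that concept in Python; it is a conceptual illustration only, not the SimSpace Weaver API, and the grid sizes and entity positions are invented.

```python
# Toy sketch of grid-based spatial partitioning, the general idea behind
# distributing a large spatial simulation across workers. Not an AWS API.

GRID_SIZE = 4          # split the world into a 4x4 grid of partitions
WORLD_EXTENT = 100.0   # the world spans 0..100 on each axis
CELL = WORLD_EXTENT / GRID_SIZE

def partition_for(x, y):
    """Return the (col, row) of the partition that owns point (x, y)."""
    col = min(int(x // CELL), GRID_SIZE - 1)
    row = min(int(y // CELL), GRID_SIZE - 1)
    return (col, row)

# Assign simulated entities (e.g. vehicles in a city traffic model) to partitions
entities = [(5.0, 5.0), (30.0, 80.0), (99.9, 99.9)]
assignments = {pos: partition_for(*pos) for pos in entities}
print(assignments)
```

Each partition can then be simulated on its own compute instance, which is why a managed service that handles the partitioning and cross-boundary coordination removes a large operational burden.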
#4 Streamlining BI
The company also announced new capabilities to help customers streamline business intelligence (BI) using Amazon QuickSight, its serverless BI cloud service. The cloud provider expanded QuickSight Q, a natural language querying capability, to support forecast and "why" questions and automate data preparation.
This means users can now "create and share paginated reports alongside interactive dashboards, quickly analyze and visualise billion-row datasets directly in QuickSight, and programmatically create and manage BI assets to accelerate migration from legacy systems," said Swami Sivasubramanian, vice president of AWS database, analytics, and machine learning.
With QuickSight made more intuitive, flexible and accessible, companies can use the service for a variety of scenarios, such as forecasting sales through natural language queries, embedding analytics in high-traffic websites or analysing massive datasets.
#5 Analysing data in a secure environment
AWS' new Clean Rooms is an analytics service for organisations that need to work on mutual data without being able to access each other's raw data. It offers a broad set of built-in data access controls, including query controls, query output restrictions and query logging, allowing businesses to customise restrictions on the queries executed by each clean room member. It also offers advanced cryptographic computing tools that adhere to strict data-handling policies and keep the data encrypted even while queries are processed, thereby maintaining user privacy.
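To illustrate the kind of restriction a clean room enforces, here is a conceptual sketch of an aggregate-only query control: collaborators can ask for summary statistics over the combined data, but raw rows are never returned, and groups that are too small to be safely aggregated are suppressed. This is an illustration of the idea only, not the AWS Clean Rooms API; the data, field names and threshold are invented.

```python
# Conceptual sketch of a clean-room-style query control: collaborators can
# run aggregate queries over combined data, but raw rows are never exposed.
# This illustrates the idea only; it is not the AWS Clean Rooms API.

MIN_GROUP_SIZE = 2  # output restriction: suppress groups that are too small

def aggregate_only_query(rows, group_key, value_key):
    """Return per-group counts and averages; never return raw rows."""
    groups = {}
    for row in rows:
        groups.setdefault(row[group_key], []).append(row[value_key])
    return {
        key: {"count": len(vals), "avg": sum(vals) / len(vals)}
        for key, vals in groups.items()
        if len(vals) >= MIN_GROUP_SIZE  # enforce the output restriction
    }

# Combined data contributed by two collaborators (made-up values)
combined = [
    {"region": "east", "spend": 10.0},
    {"region": "east", "spend": 30.0},
    {"region": "west", "spend": 50.0},  # only one row: suppressed from output
]
print(aggregate_only_query(combined, "region", "spend"))
```

The point of controls like this is that each member learns aggregate insights from the pooled data while no member can reconstruct another's individual records.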
#6 Ready to compete with chipmaking partners
AWS is essentially a software company, except when it is making hardware. This year, it rolled out new chips designed to take on high-performance computing tasks such as weather forecasting and gene sequencing. The cloud provider said it would let customers rent computing power that relies on a new version of its Graviton chips. Alongside its Graviton, Trainium and Inferentia chip lines, AWS also makes software-defined hardware devices known as Nitro Cards.
According to experts, AWS making its own chips will give its customers more cost-effective computing power than they could get by renting time on processors built by chipmakers Intel, Nvidia or Advanced Micro Devices. The move, they believe, puts AWS in direct competition with those companies, which are also among its biggest partners and suppliers.
#7 A standards-based data lake for security data
Security Lake is a service that automatically centralises an organisation's security data from cloud and on-premises sources into a purpose-built data lake in the customer's AWS account, so that customers can act on security data faster. According to a security analyst present at the event, "this can be of help to enterprises centralising all of their security data in a single data lake, using a standards-based format to manage the life cycle of this data". The solution also automatically collects and aggregates security data from AWS partner solutions including Cisco, CrowdStrike and Palo Alto Networks, and lets teams add their own data sources, such as internal applications or infrastructure logs.
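The value of the standards-based format the analyst mentions is that logs from very different vendors end up with one shared set of fields, so they can be queried together in a single lake. Below is a toy sketch of that normalisation step; the field names and source types are illustrative inventions, not the actual schema Security Lake uses.

```python
# Toy sketch of normalising heterogeneous security logs into one common,
# standards-style schema so they can live in a single data lake. The field
# names here are illustrative, not the actual schema Security Lake uses.

def normalise(source, record):
    """Map a vendor-specific log record onto one shared set of fields."""
    if source == "firewall":
        return {"source": source, "time": record["ts"],
                "actor": record["src_ip"], "action": record["verdict"]}
    if source == "endpoint":
        return {"source": source, "time": record["event_time"],
                "actor": record["host"], "action": record["alert"]}
    raise ValueError(f"unknown source: {source}")

# Two records from different tools land in the lake with identical fields
lake = [
    normalise("firewall", {"ts": 1700000000, "src_ip": "10.0.0.5",
                           "verdict": "deny"}),
    normalise("endpoint", {"event_time": 1700000042, "host": "laptop-7",
                           "alert": "malware"}),
]
print(lake)
```

Once every record shares the same fields, a security team can run one query across firewall, endpoint and application data instead of stitching together per-vendor formats.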
#8 Taking a new AWS supply chain route
AWS has also come up with a solution for supply chain disruptions in the post-pandemic world. It launched AWS Supply Chain, a new application designed to create supply chain visibility so businesses can make faster, more informed decisions that mitigate risks. Selipsky said the new solution gives a "unified view of your supply chain data, ML-powered insights, recommended actions and built-in collaboration capabilities, so you can react quickly to unexpected issues." For example, the web interface lets a business see all of its office, warehouse and other locations and view their stock inventory levels.
#9 New ways of using ML for climate science, disaster response, agriculture
In his keynote on Wednesday, November 30, Sivasubramanian announced eight new capabilities for Amazon SageMaker. One of these is a new level of support for geospatial data, which will make it easier to develop models for climate science, urban planning, disaster response, precision agriculture and more. "These new Amazon SageMaker capabilities are built for customers to take advantage of ML at scale and for social change," he said.