Enterprises now amass huge amounts of data, both from their own tools and applications and from the SaaS applications they use. For a long time, that data was basically exhaust. Maybe it was stored for a while to fulfill some legal requirements, but then it was discarded. Now, data is what drives machine learning models, and the more data you have, the better. It's maybe no surprise, then, that the big cloud vendors started investing in data warehouses and lakes early on. But that's just a first step. After that, you also need the analytics tools to make all of this data useful.
Today, it's Microsoft's turn to shine the spotlight on its data analytics services. The actual news here is pretty straightforward. Two of these services are moving into general availability: the second generation of Azure Data Lake Storage, for big data analytics workloads, and Azure Data Explorer, a managed service that makes ad-hoc analysis of massive data volumes easier. Microsoft is also previewing a new feature in Azure Data Factory, its graphical no-code service for building data transformations. Data Factory now features the ability to map data flows.

Those individual news pieces are interesting if you are a user or are considering Azure for your big data workloads, but what's maybe more important here is that Microsoft is trying to offer a comprehensive set of tools for managing and storing this data, and then using it to build analytics and AI services.