PASS Summit 2017 Day 3


The last day of PASS Summit 2017 was definitely not the least! Yet again we had some great engagement with one of the Microsoft product teams, this time the Analysis Services team, which discussed its roadmap with us and a group of other MVPs and insiders. The sessions we visited today covered the Azure Service Bus, advanced SQL security (including the new SQL Vulnerability Assessment), and a demo-heavy session by Christian Wade, one of the Analysis Services program managers.

Get on the Azure Service Bus by Stuart Moore

Sessions about services like the Azure Service Bus are not that common at a data platform conference like PASS Summit, but in our opinion they are more than welcome! Stuart gave us an introduction to Azure Service Bus and showed us some design patterns in which the Service Bus is positioned between databases and applications.

Service Bus Design Patterns

The Service Bus holds messages in a message storage queue. Messages contain headers, which are key/value pairs, and a body, which can be whatever you want: raw text, JSON, XML, binary, compressed JSON, and so on. Messages can also be published to topics, which make it possible to distinguish different types of messages from each other and let subscribers receive only the types they are interested in.
Each message has a Time To Live (TTL), which is 14 days by default. After this period the message is removed, or moved to the Dead Letter Queue if configured accordingly. The message queue can be partitioned, with each partition running on its own Azure infrastructure: if one of the partitions suffers an outage, the other partitions in the Service Bus stay online and available. A minimal example of sending such a message is sketched below.
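
To make the message anatomy concrete, here is a minimal sketch of sending a message with custom application properties (the key/value "headers") and an explicit TTL, using today's azure-servicebus Python SDK (which postdates this session). The connection string variable and queue name are hypothetical placeholders.

```python
# Minimal sketch: send a Service Bus message with header-style
# application properties and an explicit TTL (azure-servicebus v7+).
import json
import os
from datetime import timedelta

from azure.servicebus import ServiceBusClient, ServiceBusMessage

conn_str = os.environ["SERVICEBUS_CONNECTION_STRING"]  # assumed env var
queue_name = "orders"                                  # hypothetical queue

message = ServiceBusMessage(
    json.dumps({"order_id": 42, "amount": 99.95}),   # body: JSON here, but can be anything
    content_type="application/json",
    application_properties={"source": "webshop"},    # key/value "headers"
    time_to_live=timedelta(days=7),                  # overrides the 14-day default
)

with ServiceBusClient.from_connection_string(conn_str) as client:
    with client.get_queue_sender(queue_name) as sender:
        sender.send_messages(message)
```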

Azure Logic Apps are capable of sending/receiving messages to/from Azure Service Bus queues. This can be really useful in some integration scenarios, especially because Logic Apps have a very rich set of connectors which make it possible to read/write data from/to almost any data store.

Introducing the newest innovations in advanced SQL security by Ronit Reger

Microsoft is heavily investing in security at the moment, as these are turbulent times. Some facts: 2 billion records were compromised last year, on average 140+ days pass between infiltration and detection, and data breaches cost organizations more and more. On top of that, GDPR is coming, and Microsoft is committed to helping customers meet this compliance requirement.

SQL Database security is layered and consists of many different services and technologies. The outer layers consist of services that take care of vulnerability scanning, monitoring, automated threat detection and auditing. The inner layers contain database level security features like the firewall, authentication, permissions and encryption.

SQL security in layers

The SQL Vulnerability Assessment service is now in Public Preview: it is available in the Azure Portal for Azure SQL Database and will be available in SQL Server Management Studio 14.3 (coming) for use with SQL Server 2012 and up. It runs a number of built-in checks on your SQL database, identifies potential risks, and gives advice on how to mitigate them. In some cases a SQL statement that resolves the issue is generated automatically and can be executed right away. The service also checks things like firewall configuration, encryption, separation of duties, auditing, and data protection, and it tracks permissions and authorizations and identifies and classifies potentially unprotected sensitive data. The scan is lightweight, read-only, and runs on the metadata of your database; it usually completes within a few seconds, independent of the size of the database.

SQL Threat Detection detects potential vulnerabilities, SQL injection attacks, and unusual behavior. It can very easily be turned on from the Azure Portal. Security alerts, complete with recommendations on how to investigate and remediate, are sent by email when suspicious events occur. Threat Detection is generally available and currently exists only for Azure SQL Database.

SQL Universal Authentication brings Multi-Factor Authentication (MFA), which adds an additional level of authentication to help secure access to sensitive data. Microsoft’s goal is to support MFA for all SQL tools and services that connect to Azure SQL Database in the first half of 2018. This includes SQL Server Data Tools, Visual Studio, SQLCMD, BCP, and the SQL drivers. MFA can also be used in Azure B2B scenarios, which makes it possible to, for example, log in with a Gmail account with MFA enabled.
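
As an illustration of what MFA looks like from a client, here is a hedged sketch of connecting to Azure SQL Database with an interactive MFA prompt via pyodbc and the ActiveDirectoryInteractive authentication keyword of the Microsoft ODBC driver (this keyword shipped in ODBC Driver 17, after this session). Server, database and user names are hypothetical placeholders.

```python
# Sketch: connect to Azure SQL Database with interactive MFA.
# Requires the Microsoft ODBC Driver 17+ for SQL Server.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;"
    "Authentication=ActiveDirectoryInteractive;"  # triggers the MFA prompt
    "UID=user@contoso.com;"                       # hypothetical user
)
print(conn.execute("SELECT SUSER_SNAME();").fetchone()[0])
```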

VNET Service Endpoints are now in preview. They make it possible to restrict access to the database to VMs in a given VNET/subnet. On the roadmap are the ability to remove Azure SQL Database as a public endpoint (public IP), removal of outbound rules to SQL Database IPs in NSGs, VPN / ExpressRoute private peering, and the ability to assign private IPs to Azure SQL Databases.

Transparent Data Encryption (TDE) is enhanced with bring-your-own-key (BYOK) support from Azure Key Vault.

Always Encrypted using Secure Enclaves is now in early access preview.

Creating Enterprise Grade BI Models with Analysis Services by Christian Wade

This demo-packed session revealed multiple new features for creating and managing large and advanced data models. Christian Wade – Senior Program Manager on the Analysis Services team – started the session by demonstrating how to manage an Azure Analysis Services server using a few lines of PowerShell code, which you can run in Azure Cloud Shell or Azure Automation. I have written a blog post about this topic before, check it out here: “Azure Analysis Services scheduled autoscaling and pausing”.
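
The demo itself was PowerShell, but the same management plane is reachable from any language through Azure Resource Manager. Purely as an illustration, here is a hedged sketch of pausing a server via the ARM REST API from Python; the subscription, resource group and server names are hypothetical placeholders, and azure-identity is used here for token acquisition even though it did not exist at the time of this post.

```python
# Sketch: suspend (pause) an Azure Analysis Services server via the
# Azure Resource Manager REST API.
import requests
from azure.identity import DefaultAzureCredential

SUB = "00000000-0000-0000-0000-000000000000"   # placeholder subscription id
RG, SERVER = "my-rg", "myaasserver"            # placeholder resources

token = DefaultAzureCredential().get_token("https://management.azure.com/.default")
url = (
    "https://management.azure.com"
    f"/subscriptions/{SUB}/resourceGroups/{RG}"
    f"/providers/Microsoft.AnalysisServices/servers/{SERVER}/suspend"
    "?api-version=2017-08-01"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {token.token}"})
resp.raise_for_status()   # 202 Accepted: suspend is a long-running operation
```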

A new REST API is coming for Azure AS, for example to process tables. The API will provide a more reliable experience than the existing long-running HTTP connections, which do not always cope well with long durations.
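
The API was still in the works at the time of the session, but as an illustration, a refresh call against the endpoint shape that was later documented might look like the sketch below. The region, server, model and table names are hypothetical placeholders.

```python
# Sketch: ask Azure AS to process (refresh) a single table through the
# asynchronous refresh REST endpoint.
import requests
from azure.identity import DefaultAzureCredential

# Azure AS uses its own token audience, shared across region rollouts.
token = DefaultAzureCredential().get_token("https://*.asazure.windows.net/.default")

url = "https://westeurope.asazure.windows.net/servers/myaasserver/models/AdventureWorks/refreshes"
body = {
    "Type": "Full",
    "CommitMode": "transactional",
    "Objects": [{"table": "DimCustomer"}],   # refresh just this table
}
resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {token.token}"})
resp.raise_for_status()
print(resp.headers.get("Location"))  # URL for polling the async refresh status
```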

Christian then continued to show how to support the scenario of ‘incorporating’ or ‘merging’ a Power BI Desktop data model into the Enterprise environment (e.g. Azure AS). You can use the import wizard inside the Azure Portal to extract the model, and use BISM Normalizer if necessary to detect changes and merge them into existing Enterprise data models. Also worth mentioning is the solution blogged about by Kasper de Jonge for connecting your existing Power BI Desktop file with report pages to your new Azure AS data model.

The next demonstration was around the Detail Rows Expression property and how you can use it to enrich the experience when a user drills through to the details in Excel. The property lets you specify a DAX expression that returns a specific dataset for the aggregation level being drilled into, for example a SELECTCOLUMNS expression over the fact table that exposes only the columns you want users to see. For now this scenario is not yet available in Power BI.

We then saw how to leverage Object-Level Security, a relatively new feature. It provides you with all the flexibility you need to secure your Enterprise model, even preventing unauthorized users from discovering that a column exists (its name).

Perhaps the biggest feature on the show, uhh session, was the integration of Azure AS with Azure Log Analytics! A large selection of Extended Events will be captured, processed and saved automatically. The selection includes the most common events, initially 20, prioritized by MVP feedback. The events are available for analysis in the Log Analytics portal, OMS (alerting anyone?), and last but not least we will be able to connect using Power BI Desktop! The Power Query script (M) will even be generated for you with the click of a button.
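
Once those events land in a workspace, any Log Analytics client can get at them. As a hedged sketch only: the azure-monitor-query Python SDK (which shipped long after this post) could pull them out roughly as follows. The KQL table and filter are assumptions about where Analysis Services diagnostics would land, and the workspace id is a placeholder.

```python
# Sketch: query a Log Analytics workspace for Analysis Services events.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())
result = client.query_workspace(
    workspace_id="00000000-0000-0000-0000-000000000000",  # placeholder
    # Assumed KQL: diagnostics commonly land in AzureDiagnostics,
    # filtered here by resource provider.
    query=(
        "AzureDiagnostics"
        " | where ResourceProvider == 'MICROSOFT.ANALYSISSERVICES'"
        " | take 10"
    ),
    timespan=timedelta(days=1),
)
for table in result.tables:
    for row in table.rows:
        print(row)
```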

NDA Focus Group session with the Analysis Services team

We concluded the PASS Summit with another Focus Group session, this time with the Analysis Services team! We were able to discuss roadmap items and provide our feedback. The session was packed with exciting upcoming features, but since they are under NDA we cannot share any details!

Thanks Christian, Bret and the rest of the team for a great session – it was nice to meet you in person here in (snowy) Seattle!

To conclude…

It was a successful and amazing event. We hope to see everyone again next time!
Cheers, Dave & Jorg

