This day started with a keynote session with Bill Graziano (Blog | Twitter), PASS president, and Quentin Clark (Blog), corporate vice president in the data platform group at Microsoft. Bill presented some fascinating statistics about PASS, such as the number of events held worldwide, the size of the membership and the training hours provided around the globe. Quentin announced the release of SQL Server 2014 CTP2 and presented all the exciting new capabilities, such as In-Memory OLTP, Windows Azure HDInsight and Power BI for Office 365. You can read all about it here. One of the exciting things about the keynote was being able to see all 5,000 attendees in a single room. It’s big!
Next, I attended a session by Thomas LaRock (Blog | Twitter), a technical evangelist at Confio Software, and Tim Chapman (Blog | Twitter), a premier field engineer at Microsoft. The title of the session was Query Performance Tuning: A 12-Step Method. Thomas talked about the importance of having a defined process for query tuning, and together they presented the 12 steps of the suggested process.
Generally speaking, I was disappointed by this session, because it was pitched at an intermediate level, and I didn’t learn anything new. But Thomas and Tim definitely did a great job of presenting the concepts and keeping the audience interested and engaged.
The next session I chose was about memory internals in SQL Server 2012, and it was presented by Bob Ward (Blog | Twitter), principal architect escalation engineer at Microsoft CSS. This session was meant to be level 500, but it was more like level 800. It was 3 hours of a very deep review of the memory architecture in SQL Server. I really tried to follow Bob throughout the session, but it was too much for me. I think I managed to understand around 40% of the presentation. The main problem was that there was nothing in this presentation that I could take away and implement. There is no doubt that Bob has a deep understanding of the memory architecture in SQL Server, but the material he presented was simply not practical, at least for me. If you want to give it a shot, you can download the presentation from here.
My last session today was Automate Your ETL Infrastructure with SSIS and PowerShell by Allen White (Blog | Twitter), practice manager at UpSearch. This session was like a cooking lesson. The ingredients were: SSIS, BIML, .NET, SMO and PowerShell. Allen took all of these technologies, presented them very clearly, one by one, and built a complete solution based on all of them. The solution was an infrastructure that dynamically generates ETL packages for a given pair of source and target databases.
I find each of the technologies mentioned above very interesting, and I think that each of them has a lot of relevant use cases. But I’m not sure that the combination Allen presented is the right way to go. First of all, there is an object model for SSIS, so instead of generating BIML that will later generate packages, it seems more straightforward to me to use the SSIS object model to generate the packages directly, without having to learn BIML at all. Second, the specific example that Allen used was a very simple one, which only merges a list of known tables with the same schema in the source and target databases. In this simple case, I would prefer other solutions, such as the MERGE statement or even transactional replication. But I realize that the scenario Allen presented was crafted this way in order to be simple enough for demonstration purposes. In real-life scenarios, I’m sure that the solution Allen presented, or some variation of it, can certainly be the right way to go. You can download all the materials from this presentation here.
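To illustrate the MERGE-based alternative I mentioned, here is a minimal T-SQL sketch for a single table. The database, table and column names are all hypothetical, just to show the shape of the statement:

```sql
-- Hypothetical example: synchronize one target table with its source copy,
-- assuming both share the same schema and a common key (CustomerId).
MERGE INTO TargetDB.dbo.Customers AS tgt
USING SourceDB.dbo.Customers AS src
    ON tgt.CustomerId = src.CustomerId
WHEN MATCHED AND (tgt.Name <> src.Name OR tgt.Email <> src.Email) THEN
    -- Row exists in both: update only when something actually changed
    UPDATE SET tgt.Name = src.Name, tgt.Email = src.Email
WHEN NOT MATCHED BY TARGET THEN
    -- Row exists only in the source: insert it
    INSERT (CustomerId, Name, Email)
    VALUES (src.CustomerId, src.Name, src.Email)
WHEN NOT MATCHED BY SOURCE THEN
    -- Row exists only in the target: remove it
    DELETE;
```

For a known list of tables with identical schemas, one such statement per table does the same job as the generated packages, without any package-generation infrastructure.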
Tomorrow is a new day with new sessions, new people and new experiences.
See you tomorrow!