Preparing a Technical Session Part 2: Coming up with a Title

Earlier this month I began a blog series on preparing to do a technical talk. In that first post I discussed some strategies for coming up with a topic, which is ultimately the first step in your preparation.

In this post I’ll walk you through some tips for coming up with a title for your presentation. Just as a reminder, in this blog series on preparing a technical session I’ll cover the following steps:

  1. Picking a Topic
  2. Coming up with a Title
  3. Writing an Abstract
  4. Building the PowerPoint
  5. Building the Demos
  6. Delivering the Presentation

Coming up with a Title

You may be thinking that once you’ve come up with a topic you’ve got the title nailed down too, but I actually see these as two different things. The topic is the general idea of what you’ll be talking about. It gives you a guide for how you’ll work out the details. The title is one of those details. For example, my topic might be “Intro to SSIS” but I make the title “Getting Started with Integration Services”. This title is very clear on what the topic is and what depth the audience can expect.

Why is it important to have a good title? Well, let’s be honest: many people attending conferences probably only look at a small version of the schedule that doesn’t include the detailed abstract, so the title is all they have to go on. If your title isn’t clear about what you’ll be discussing, why would someone attend? Let’s go a little deeper and look at some tips you may want to use when naming a session.

Don’t be too Cute

Do you consider yourself a creative person? Do you like to be unique and stand out from the pack? Good, now stop it! I’m kidding, to some extent. You want your session to stand out, and if the only thing people have to go by is the title then you may want to be a little more creative or zany with your session title. There’s nothing wrong with this as long as it’s still very clear what your session is about. Here are a couple of good examples of being creative with your session title while still being clear and interesting:

  • Help! I’m a new DBA, Where do I start?!
  • DBA Mythbusters
  • Triggers: Born Evil or Misunderstood?

These sessions stand out but are still very clear about what the talk will cover. On the opposite end of the spectrum, here are some session titles that are certainly unique but leave me with no idea what to expect if I were to attend.

  • SQL Server: We’re not in Kansas Anymore
  • Kill “BI”ll Quentin Tarantino Style

These sessions tried too hard to be cute and went past being unique to just being confusing.

Short and Sweet

I’ve often made the mistake of wanting to be so clear about my session that my title starts to look more like a poorly written paragraph. The intent is to eliminate confusion, but what happens is that, without knowing it, you begin talking people out of your session just by having them read the title. Here’s one I wish I could have back:

  • Using SQL Server 2014 to Build Analysis Service Multidimensional Cubes

Good topic, but a poorly written title. It was way too long and complicated when it really didn’t need to be. If I could rewrite it I would simply name it “Building Analysis Services Cubes”. All of the other details that I decided to put in the title should have been saved for the abstract. That way, if someone was wondering which version of SQL Server I would be demonstrating, they could read the abstract to find out more.

Use Active Language

Using active language is a good method for making certain that your session topic is clear. In many cases this helps your potential audience know the kind of demos (if any) to expect. Words like “Building” or “Developing” tell your audience that they can expect demos. For example, if I went to a session titled “Developing Reporting Services Reports” I would expect demonstrations, not PowerPoint slides that show me how to develop a report. Here are some examples of session titles using active language:

  • Building Dashboards with Your SalesForce Data
  • Overcoming Data Warehouse Design Challenges
  • Getting Started with Indexes

I don’t think it’s mandatory that you use this tip for every session you do but if you’re struggling for a title then this may help.

Getting Started with the Power BI Dashboards Public Preview

If you’re currently an Office 365 customer that is using Power BI sites then you probably found an early Christmas present this morning when you logged into your Power BI tenant.

Today Microsoft announced the Public Preview of Power BI Dashboards.  If you watched this PASS Summit Keynote (fast forward to the last 10 minutes) then you got an early glimpse of what the new tool will look like.

If you want to start using this preview today then here’s what you need:

  • Must be an Office 365 customer (Corrected)
  • Must have SharePoint online (Corrected)
  • Must have a Power BI tenant (Corrected)
  • Currently you must be a US customer

Getting Started

To get started you’ll first need to log in to your Power BI tenant. Once you’ve logged in you’ll find in the top right the ability to launch the public preview by clicking “Try it now” as shown below.


Once this launches you may be prompted to sign in again. Once you’re in you will immediately see a sample dashboard called Retail Analysis Sample that gives you an idea of what you can do in the preview.


If you want to get started with your own dashboard you must first find the data you want to visualize. Click Get Data on the left side of your screen to get started.


You can see the current data sources available to you on the left side of the screen.


Now if this seems like an incomplete list don’t worry. You actually have the ability to connect to any data source that you would typically pull in with Power Query by using the second option on the list called Power BI Designer File.

The Power BI Designer is a companion application for building Power BI Dashboards. I’ll discuss more about this in a future post, but for now just know that it’s a client tool that combines the capabilities of Power Query and Power View without needing Office 2013 Professional Plus installed. That’s a huge bonus for those eager to try Power BI but running non-Professional Plus versions of Excel.

Probably the simplest way for you to get started is by pointing the preview to an existing Power Pivot workbook you have. So in my example I’ll select Excel workbook and then click Connect.

I really love what you see next. My assumption was that I would have to upload my workbook, but fortunately that’s just one option. You can also connect directly to your workbooks if they are currently stored in either OneDrive or OneDrive for Business. I am a big OneDrive fan, especially since Microsoft now allows Office 365 customers to have unlimited storage.


So I select OneDrive and after logging in to my account I see all my files available to select from. I pick the Power Pivot workbook I would like to build a dashboard on and then click Connect. Once connected you’ll be notified as shown below that you can start working on your dashboard.


You’ll find three groups of objects here: Datasets, Reports and Dashboards.


Datasets are the connections to your model. Selecting a dataset here will launch an empty Power View report with a connection to the dataset you selected.


Reports are basically the new way of creating Power View reports. The development is done completely in the web browser, and any visualizations you create here can be pinned as dashboard items. There’s a ton of new visualizations available here. Again, I’ll save that for a future blog post.


Dashboards are the completed dashboards that you’ve created. Any Power View reporting object that you’ve pinned will appear here.

If you click on the new dataset that you just added, it will launch the new Power View reporting interface so you can explore your data. If you’ve worked in Power View before you’ll find there is not a huge learning gap for you. Here’s what the interface looks like.


You still select the fields you want to visualize, but now the visualization options have been much improved. Here’s a quick preview of a couple of the new types of visualizations:



Tree Map


Filled Map




In addition to these you can also do combo charts!  Huge win with that one.  The dataset I used just didn’t have data to support showing that one to you.

If you want to pin one of these visualizations to a dashboard you simply hover over it and select Pin to dashboard as shown below.


There are some things I really like about the new filtering capabilities, but again I’ll save that for a future post. When you’re happy with the report click Save in the top ribbon and then you can access it again later.

Once you’ve saved you can return to the main Power BI Dashboard pane and select the workbook that you uploaded under the Dashboards section.

When your dashboard opens you’ll notice two things are already present. The first is that Power BI Q&A is automatically available at the top of every dashboard. The second is that a connection to the workbook you uploaded is embedded in your dashboard. This can be removed later, but it gives you a quick way to launch Power View when you select it.


The way to create dashboard items is to create Power View reports and then pin the items you want (as shown earlier) to the dashboard. You can have different Power View reports all pinned to a single dashboard. Any item pinned on the dashboard can be selected to drill through to the Power View report it was based on. The Power View report then serves as the detail view of the data, and the dashboard is the high-level overview of the data.

You can also pin the results of a Power BI Q&A natural language query.  If you use the Q&A search at the top of the dashboard you can pin the results to the dashboard as well.


This also allows you to pin visualizations of a simple number you want to monitor as shown below.


My final dashboard, using a combination of pinned Power BI Q&A results and pinned Power View visualizations, looks something like this.


Overall I love what I’m seeing with the new Power BI Dashboard capabilities and can’t wait to see how these reports are surfaced through things like mobile devices.

I’ll also point out that I did find a few quirks here and there, but keep in mind it’s still in preview; I’m sure these things will be fixed before the official release.

Preparing a Technical Session Part 1: Picking a Topic

So you’ve decided, or perhaps were told, to do a technical presentation. If this is something that’s new for you then you may be going through a variety of emotions. You may start out excited in anticipation of the event, but that quickly changes to anxiety when you realize all the work that’s ahead of you.

Delivering a presentation, regardless of the subject matter, can be a challenge. Even if you’re a seasoned speaker there are several steps that lead up to completing a successful presentation.

In this blog series on preparing a technical session I’ll cover the following steps:

  1. Picking a Topic
  2. Coming up with a Title
  3. Writing an Abstract
  4. Building the PowerPoint
  5. Building the Demos
  6. Delivering the Presentation

While discussing these I’ll be sharing not only how I personally go through this process but also feedback I’ve gathered from peers. The good news is that with more experience these steps will likely flow more naturally for you and, hopefully, with less stress.

My goal is to help guide those that are new to presenting through the process, help them understand what to expect and hopefully help grow a larger pool of speakers at events. 

Picking a Topic

When you submit to a major conference, deciding the topic to focus on is your first step to getting started. This step is clearly critical because the idea that you come up with will impact the rest of your preparation. If you’re in need of a topic and have the equivalent of “writer’s block” then here are some tips to help your brainstorming process.

Talk About Your Passion

Have you ever been assigned a presentation topic that you’re not really passionate about? This may happen more in a corporate environment when you’re given a topic that just has to be covered with co-workers. When you’re not excited about a topic then it can often show in your preparation and delivery of the content.

If you are passionate about a topic then you’re more likely to write a compelling abstract, be more proactive about content development, and even deliver the information in a way that connects better with the audience. Now keep in mind some people may not be passionate about the same things you are, so try not to be offended when others don’t share your excitement about a topic.

Present On What You Know

This seems obvious, but you’d be surprised how many new speakers pick topics that are completely out of their comfort zone. If you’re a new presenter then this whole process may be foreign to you already, so don’t add any extra pressure on yourself to learn a completely new topic. Now, having said that, I do see some experienced presenters occasionally pick topics that aren’t necessarily new to them but are certainly going to challenge them to learn a few new skills. So in short, if you’re new at this pick a topic you know well, and if you’ve been doing this for a number of years then do what works for you!

Use Things You’ve Done At Work

Give yourself some credit. You’re smart and pretty good at what you do! I bet you’ve come up with some pretty inventive ways of solving problems while at work. Why not share some of the design patterns you’ve used to help others?

Don’t worry, I’m not suggesting that you do anything that would hurt your company and potentially cause you to lose your job. I bet the problems you experience at work are the same kinds of problems that others are experiencing. Take your solution and generalize the details, including the data, so that it doesn’t matter where you work. The other benefit is that these topics are often the most popular because they’re based on real-world problem solving. My number one goal when I attend a session is to figure out how I can use what I just learned when I get back to the office. What better way to deliver that than by showing problems you’ve actually solved at work?

Is Anyone Else Interested?

If you’re debating whether or not a topic would get much interest, then ask! Take to social media with a poll of topics you’re thinking about presenting on and see what people like best. Not only are you getting valuable information back but you’re also doing a little early promotion for your session.

Journal Topic Ideas

Ideas can come at any moment. If you’re not prepared then you could have a stroke of brilliance and, before you know it, you’ve forgotten it. Be prepared and keep a pen and paper handy, or if you’re living in this decade sign up for Evernote or OneNote and log your topic ideas in a digital journal.

PASS Summit 2014 in Review

I’m finally recovered from a great week in Seattle for PASS Summit. Now that it’s back to the grind of regular work I thought I’d put together some thoughts and tell you about my experiences from the week.

You may have read others’ blogs about their experiences during the conference. I always love how everyone may do very different things while at this conference but still have a great time!

My week started early as I arrived in Seattle on Saturday to prepare for MVP Summit, which was going on at the same time. There were some mixed opinions about these events going on at the same time, but I liked it. I think this helped many that are from out of the country justify coming to both events in a single trip rather than making two expensive and time-consuming trips to Seattle. MVP Summit started Sunday for SQL Server MVPs and ended a bit early for me, on Monday, because I delivered a Pre-Con on Tuesday. Once our day in Redmond was done on Monday I went and registered for the PASS Summit in Seattle and called it an early night, knowing that I had a full day of talking during my Pre-Con coming up the next day.


This was a very full day as Brian Knight (Blog | Twitter) and I taught a full day of SSIS: Problem, Design, Solution in our Pre-Con. The Pre-Con was well attended with about 140 attendees. I really enjoy teaching, and even attending, sessions like these that focus on solving problems. There are many reasons why I attend PASS Summit, but one of the biggest is that I want to learn things that I can immediately go use to help me solve problems when I get back to work. I think we accomplished that with this session. It was very focused on problems and the different ways to solve them using SSIS.

Brian and I have done many presentations together over time but we’re always learning still.  One of the things we learned from feedback last year was to provide a PowerPoint deck with a little more substance.  PASS prints these decks for attendees to keep and take notes on.  While a single image slide with verbal discussion comes across great in a presentation it’s not effective when the slides are printed.  By the way, I still like the single image slide for presentations but not when you know the slides are printed for attendees.

Feedback on the session this year seemed to be pretty good but I’ll know for sure when the reviews are released.

Once the Pre-con was done I dropped my things off at the hotel and came back to the conference for the Welcome Reception.

Later that night I went to the Networking Party that was moved to Yard House. I stayed and talked for a bit and met some new people. Unfortunately it looked like they weren’t ready to handle that volume of people at once, so I ordered a meal from somewhere else to end my night.


On Wednesday I started my morning after breakfast by attending the Keynote featuring T.K. “Ranga” Rengarajan, James Phillips, and Joseph Sirosh from Microsoft. There were definitely some interesting things announced during this Keynote, including the Power BI Dashboards Preview coming soon. You can still watch the Keynote on PASStv.

Wednesday I had to deliver my session towards the end of the day, so between practice runs I wasn’t able to attend as many sessions as I would have liked. I did, however, attend Bradley Ball’s (Blog | Twitter) session on Using PowerShell to Manage Cloud Integrated Data Platforms. Brad always does a great job and shows his passion for the technology. He showed many of the capabilities that PowerShell has for integrating into Azure.

Towards the end of the conference day I sat in the session that was prior to mine because I was a bit worried there would be a lot of crossover with my topic, since it was Power View related too. Luckily there was very little that was the same. I did my session on Creating an End to End Reporting Solution with Power View. I think it went pretty well, even with Excel crashing on me. I always have a backup ready :). There was a lot of interest in Power View with SharePoint and how Multidimensional cubes interact with Power View, so I saved some time towards the end for that discussion.

Immediately after my session I ran down to the Expo Hall for the Exhibitor Party where I was doing a book signing for Pragmatic Works.  We do this every year and it’s always a blast to meet so many people and give away free books!

Here’s a video that Jeremy from our team took showing the line that built up.

After the Exhibitor Party Pragmatic Works hosted our annual karaoke party at the Hard Rock with a live band.  I always have fun here and was able to catch up with friends.



I started my day by attending the Keynote by Dr. Rimma Nehme. This was possibly one of the best Keynotes I’ve attended. Her explanation of cloud computing was done in a way that I think my sales team would be comfortable with.

You can watch the Keynote on PASStv or read Steve Jones’ (Blog | Twitter) summary of it.

In between networking with a few new folks I also attended two sessions: The BI Power Hour and Top Five Power Query M Functions That You Don’t Know.

The BI Power Hour is always a fun “infotainment” type session with many of the members of the Power BI product team.

I also really enjoyed the Top Five Power Query M Functions session.  I’ve always really liked Power Query but Reza Rad (Blog | Twitter) showed me a few new tricks!

I ended my day by attending the PASS Community Appreciation Party at the EMP (Experience Music Project) Museum. This place is always fun, and of course they had live band-led karaoke.


This was my day to work the Pragmatic Works booth, so my morning was eaten up by that. To end the day I sat in on the Speaker Idol competition. The winner of the last round would be guaranteed a session at PASS Summit 2015. I was curious to watch this because we do a version of it locally for the user group in Jacksonville. Here’s Jason Carter (Blog | Twitter) doing his presentation.


I had a late flight home and actually didn’t end up back in Jacksonville until 11 AM Saturday morning… Long day!

If you’ve never been to a PASS Summit before I highly recommend it. It is an event run by the community, which makes it unique. While there are sponsorships and partnerships it’s community run, so you hear from and meet a lot of people that are going through the same things you are. Next year’s PASS Summit will be Oct 27-30. I hope to see you there!

Creating a Reporting Services Subscription

Subscriptions are a great feature in Reporting Services that will run a report unattended and deliver it to users either by email or Windows File Share. Subscriptions can also be scheduled so if your end users need a report monthly you can automate the delivery process. The steps given in this article will show you how to configure subscriptions using the Windows File Share delivery method.

Before you start to configure your first subscription make sure to have the following taken care of:

Start the SQL Server Agent

When you installed Reporting Services it also created the ReportServer database. This database stores all subscription related data among other things. Ensure the SQL Server Agent is running on the database engine that has the ReportServer database.

To start the SQL Server Agent you can either log in to the database engine in SQL Server Management Studio or open SQL Server Configuration Manager, then right-click the SQL Server Agent and select Start.
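If you’d rather verify this from a query window, a quick check is the sys.dm_server_services DMV (available in SQL Server 2008 R2 SP1 and later); a minimal sketch:

    -- Confirm the SQL Server Agent service is running on the instance
    -- that hosts the ReportServer database
    SELECT servicename, status_desc, startup_type_desc
    FROM sys.dm_server_services;

The SQL Server Agent row should show a status of Running.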


Enable sharing on the folder you are delivering to

If you have decided to deliver reports via Windows File Share then you must configure the folder that will accept the reports for sharing. Navigate to the folder that you want to be used with the Subscription and right-click on it then select Properties. Go to the Sharing property tab to enable sharing.


Use SQL Server Authentication on the data sources used in the report

To enable Subscriptions on a report you must have a SQL Server Authenticated account used to access the data sources. In SQL Server Management Studio, create a new SQL login that has db_datareader access to the databases used in the report data sources.
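Here’s a minimal T-SQL sketch of that setup. The login name, password, and database name below are just placeholders for this example:

    -- Create a SQL Server authenticated login for the report data sources
    CREATE LOGIN [ReportReader] WITH PASSWORD = 'StrongP@ssw0rd1';
    GO

    -- Grant it read access in the database the report queries
    USE [AdventureWorks];
    GO
    CREATE USER [ReportReader] FOR LOGIN [ReportReader];
    ALTER ROLE [db_datareader] ADD MEMBER [ReportReader];  -- use sp_addrolemember on SQL Server 2008 R2 and earlier
    GO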


After you have created this account, change the report data sources in Report Manager to make sure it is being used.


Now that these prerequisites have been taken care of you can actually create a new Subscription. Click on the report that you want to add a Subscription to and click New Subscription in the report toolbar. If you do not see the New Subscription option you may need to have your privileges elevated.


There are several options you have when creating a Subscription:

Delivered by – Can either be Windows File Share or Email. If you have not set up your SMTP server to send email then your only option is Windows File Share. The SMTP configuration is set up in the Reporting Services Configuration Manager.

File Name – What you want the file that is created to be called. Also, there is a checkbox for “Add a file extension when the file is created” which is checked by default. There may be some special circumstances where you do not want the file extension, like when your work email does not accept attachments with an .xml extension.

Path – The file path where the file should be sent to. It must be written as the full UNC path (\\ServerName\FolderName). For example, my path would be \\Devin-PC\Reports.

Render Format – How you want the report to be rendered. Your options are XML file with report data, CSV (comma delimited), Acrobat (PDF) file, HTML 4.0, MHTML (web archive), Excel, RPL renderer, TIFF file, or Word.

Credentials used to access the file share – This must be a login that you set up to have access to the file share in the earlier prerequisites.

Overwrite options – Overwrite an existing file with a newer version is set by default and will always replace a file if they are named the same. Do not overwrite the file if a previous version exists will keep the original file and cause the subscription to fail. Increment file names as newer versions are added will continue to add a new file if one already exists with the same name but will increment it (ReportName_1.pdf, ReportName_2.pdf, ReportName_3.pdf).


You can also click Select Schedule to design a set time for the Subscription to run. You can also configure when you want the schedule to start and how long it should continue. Once you have set the schedule you want, click OK.


This will return you to the previous screen where you can lastly decide to pass in different values to the report parameters before you finalize the Subscription by clicking OK.

On the Subscription tab of the report you should see listed all the Subscriptions that have been created for this report.


When the Subscription runs you should see the file now in your shared folder. If you open Management Studio and take a look at the SQL Agent jobs you will see there is a new one for the Subscription you created. Here you can run the Subscription manually so you do not have to wait for the scheduled time.
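The Agent job is named with a GUID rather than the report name, so it isn’t obvious which job belongs to which subscription. Here’s a hedged sketch of how you might match them up and run one on demand; it assumes the default ReportServer database name and the catalog tables that ship with Reporting Services:

    -- Match SQL Agent jobs to their Reporting Services subscriptions
    SELECT j.name AS JobName, s.Description, s.LastStatus, s.LastRunTime
    FROM msdb.dbo.sysjobs AS j
    JOIN ReportServer.dbo.ReportSchedule AS rs
        ON j.name = CAST(rs.ScheduleID AS NVARCHAR(128))
    JOIN ReportServer.dbo.Subscriptions AS s
        ON rs.SubscriptionID = s.SubscriptionID;

    -- Kick off a subscription's job on demand instead of waiting for the schedule
    EXEC msdb.dbo.sp_start_job @job_name = N'<job name (GUID) from the query above>';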


WARNING: Do not change any settings of this job. Even just changing the name could cause the Subscription to fail.

Optimizing Power BI Q&A with Synonyms and Phrasing using Cloud Modeling

If you’ve used or even just seen a demo of Power BI Q&A, you’ve likely seen there’s great potential in the feature for even low-tech users. I usually say, “If the user knows how to use a search engine then they can interact with Q&A.” You can read the basics of how Power BI Q&A works in my previous post here.

While having this capability is great, it can take some fine-tuning to perfect for the users interacting with it. Power BI now provides you with the ability to optimize your models for Q&A, and it can all be done directly from the Power BI site with what Microsoft is calling Cloud Modeling.

Cloud Modeling gives you the ability to add synonyms and phrasings to your Power Pivot workbook from the web interface in Power BI sites. Let’s look at a couple of scenarios that show why synonyms and phrasings are necessary and how these features can solve the problem.

The Problem

Let’s discuss the problem through a simple example.


This data model is designed to show US Presidential Election data. When this is added to Power BI Q&A users will likely ask questions like “How many votes by party and state” or “Which candidate won the election”.

Data consumers that interact with Power BI Q&A will often ask questions that do not correlate to the exact table or column names that are actually in your Power Pivot data model. And sometimes users will use linguistic terms that are difficult for a computer to interpret when returning data.

For example, take the sample questions I gave a moment ago.  Let’s start with the question “How many votes by party and state”. The issue with this question is that it references things like votes and party. Looking at the data model you’ll find that none of these columns exist. So asking this question will return no results.

The question “Which candidate won the election” has a different issue to address. In this case we’re using the verb ‘won’ to describe a relationship between a candidate and an election. Because this verb appears nowhere in the data model we would get zero results returned.

Now let’s look at how these problems can be solved with synonyms and phrasings.

The Solution


To solve our first problem with the question “How many votes by party and state” we would need to create synonyms on the appropriate columns in your data model. Synonyms are like aliases that can be created for both tables and columns inside your Power Pivot data model.

You should create synonyms on your tables and columns for the different ways that people would ask for your data. For example, in my question “How many votes by party and state” I would need to create the following synonyms in my data model:


Power Pivot Model Name      Synonym to Create
Total Popular Vote          Votes
Party Name                  Party

There are two locations that you can create synonyms on your model. You can either launch the Excel desktop client (Office 365 only) and make the change in the Power Pivot model or you can do it using the Cloud Modeling approach on the Power BI site under the Power BI Site Settings.

Synonyms in Excel

Let’s look first at the Excel approach. When looking at your Power Pivot model in Excel you can add synonyms from the Advanced tab by clicking the Synonyms button. This button only appears if you are using the Office 365 version of Excel.


This will launch the diagram view and a Synonyms pane.  Select the table you wish to create your alias on and then type the synonyms for either the table or columns you wish.


Once you’ve done this the next time you deploy your changes to Power BI you can ask questions that utilize the synonyms you’ve created.

Synonyms in Cloud Modeling

While creating synonyms inside of Excel is nice, I honestly could see people more often defining synonyms using the Cloud Modeling approach. To use this approach you must first deploy a workbook to Power BI and enable it for Q&A. I discussed how to do these steps in the previous post here.

Once the workbook is in Power BI you’ll go to the Power BI Site Settings page as shown below. This is also where you will go to configure Phrasings, which we’ll discuss later.


Select Q&A.


Then find your workbook and click the ellipsis next to it.  Select Optimize for Q&A to launch the Cloud Modeling window.


You can use this to ask questions using Q&A but also enhance the model using the Cloud Modeling options in the right pane. Notice here I asked my question “How many votes by party and state” but did not get any results. In fact, some of my question “How many votes…” is grayed out because Q&A was not able to find anything in my Power Pivot model that matched this name.

I know I have data for votes in my model, but the original column likely had a different name. To fix this I can add a synonym by clicking the Synonyms tab in the Cloud Modeling pane.

Just like we saw earlier in Excel, you select the column that you wish to create a Synonym for and type a comma-separated list of values you would like as aliases. You’ll notice here that as soon as you type in the synonyms and click away, Power BI Q&A is now able to answer the question.

When you’re happy with the synonyms you have created you will need to hit Save to send the changes back into the Power Pivot workbook.


Phrasings in Cloud Modeling

The other problem we discussed earlier was with the extra words we may use to describe the relationship between things. Take the example from earlier: “Which candidate won the election”. The problem is how we define what it means for someone to ‘win’ an election.

Creating synonyms would not help solve this problem because it is not simply an alias for something else. In this case the term ‘won’ is used when talking about our data to define a relationship between the two entities: candidate and election. To solve this problem we must create a Phrasing.

Phrasing can only be done from the Cloud Modeling pane (not Excel) on the Power BI site that we just looked at in the previous example. So from the Cloud Modeling pane I’ll click Phrasing and ask the question “Which candidate won the election”.

This does return results, but you’ll notice that the word ‘won’ is grayed out, meaning Power BI Q&A can’t figure out what to do with it. So the results are just showing me all candidates that have values in my election table.


We need to add a Phrasing to define what ‘won’ means. In the Cloud Modeling pane I click Add Phrasing and type “Candidates win elections” then click OK. This helps and defines the relationship but to take it a step further I can click on Show Advanced Options to define the threshold for a ‘win’.

To win a US presidential election a candidate must have at least 270 electoral votes (sorry non-US citizens, no time to explain the dynamics of our election process). So I set up a condition defining that for someone to ‘win’ an election they must have 270 or more electoral votes.


This changes my results to only show candidates who have won elections. To verify this is correct I change my question to “Which candidate won the election in 2008” and, as you would expect, the only result returned is Barack Obama.


With these changes made I would click Save to ensure these changes are pushed back to the workbook. You’ll also notice there’s an option here to export or import the configuration you turn on here. Perhaps you could use this to create your Synonyms and Phrasing on a development site and import them into production.

With these changes implemented your users should be able to ask typical questions of the data without you worrying about poor results.  Hope you enjoyed this tour of Cloud Modeling.

Analysis Services Partitions

Partitions are a great feature in Analysis Services that allow you to split measure groups into separate physical files. All measure groups have one partition by default, but by splitting that partition you will gain improved query and processing performance.

Partitions can be split in any way you see fit. Many people choose to separate measure group data by date. This makes sense because just about every fact table is going to have a date dimension associated with it. It also makes for a clear way to draw a line in the sand where each partition can be separated. For example, say your company has three years’ worth of sales data, so you decide to split the sales measure group into three partitions, one for each year. Depending on the sales volume you may decide to split it even further, down to each quarter or even daily.

Multiple partitions can only be deployed to a server running SQL Server Enterprise edition. However, if you are developing on a machine that uses Standard but will be deploying to Enterprise, then you can simulate developing in Enterprise by right-clicking on the project file in the Solution Explorer and clicking Properties. Here you can change the edition of SQL Server you are deploying to, and this will also change the restrictions in BIDS on what you can and cannot do.


So why would you want to add partitions to your cube? The leading reason is to increase performance. How does adding partitions actually help? It helps in several ways.

Query Performance

Query performance will be increased because rather than querying an entire measure group Analysis Services can isolate a single partition to search. It can also search multiple partitions in parallel if need be.

Processing Performance

Partitions will also increase processing performance. Processing performance is optimized for the same reason query performance is. Instead of processing the entire measure group you can just process your most current data, skipping over the partitions that store data that is years old and doesn’t need to be reprocessed. Analysis Services will also allow you to process partitions in parallel, and when processing the cube you can choose which partitions to process rather than processing everything.


My recommendation for setting up partitions is to not let a measure group exceed 50 million rows before creating a second partition. This is very subjective though; 50 million rows is the maximum you would ever want to see in a partition, and there is nothing wrong with creating a partition before you reach this number. You could decide to create a new partition after just one million rows. I also generally recommend using the default MOLAP storage mode.


Here are the very basic differences between the different storage modes:

MOLAP – Data and aggregations are stored in multidimensional format. This makes for slower processing time but faster query time. The cube must be reprocessed to get updated data.

Scheduled MOLAP – Same as MOLAP, but the cube refreshes every 24 hours.

Automatic MOLAP – Same as MOLAP, but updates in the relational database raise events that trigger a cube refresh.

Medium Latency MOLAP – Same as MOLAP, but updates in the relational database trigger a switch to Real-time ROLAP while the cube is processing. When the cube completes processing it returns to MOLAP. The default latency is 4 hours.

Low Latency MOLAP – Same as Medium Latency MOLAP, but the default latency is 30 minutes.

Real-time HOLAP – Data is stored in the relational data source but aggregations are stored in multidimensional format. Faster to process but slower to query.

Real-time ROLAP – Data and aggregations are stored in the relational data source. Fastest to process but the slowest to query. There is no data latency.


Now that you know what partitions are and why they are helpful, let’s go through the steps of splitting a measure group into two partitions. In this example you want to place internet sales that occurred before 2003 into a partition called Internet Sales Old, and everything after that date should go into a partition called Internet Sales New.

Step One – Restrict Rows to Original Partition

Remember all measure groups have at least one partition by default. Before creating a new partition you must first change the old partition to restrict what rows are brought back. If you don’t do this before trying to create a new partition you will get the following warning:


Click on the Source for the original partition so you can restrict the rows that it stores.


· Change the Binding type from Table Binding to Query Binding. You could leave this as Table Binding if you separate your measure group data into multiple tables or views.

· When you change this property to Query Binding you will see that it automatically provides you with the query that returns the table, with a blank WHERE clause at the end. Remember we want this partition to hold all data prior to 2003, so the WHERE clause needs to be changed to only return data prior to that date (Ex. WHERE OrderDateKey <= 20021231). Once the query has been changed hit OK. The resulting query is sketched below.
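For reference, here’s roughly what the finished query binding might look like. I’m using the Adventure Works sample’s FactInternetSales table as a stand-in for whatever fact table backs your measure group:

    -- Original partition: keep only rows prior to 2003
    SELECT *
    FROM dbo.FactInternetSales
    WHERE OrderDateKey <= 20021231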


Step Two – Create a Second Partition

Rows are now being restricted from your first partition so you can now create a new one.

· Select New Partition under the measure group that is ready for a second partition and hit Next to start the Wizard.


· Check the table(s) used for this measure group that should be used in this partition then hit Next.

· Check the box that reads Specify a query to restrict rows and add to the WHERE clause like you did in step one, this time to bring back all the dates after 2002 (WHERE OrderDateKey > 20021231). Be very careful when writing these WHERE clauses. If you accidentally used >= instead of > then there would be overlapping sales data for 20021231. I could also accidentally exclude a day if on the first partition I used < and on the second partition >. This would exclude one day’s worth of data from my measure group. On the bottom of the dialog box you will see a warning describing the possibility of overlap and missing days (a quick verification query is also sketched after this list). Hit Next when you finish typing the query.


· You can select a storage location other than the default if you would like, then hit Next.

· On the last screen you can give the partition a name like Internet Sales New and decide whether you want to design aggregations now or later. I will write a second article explaining aggregations, so select Design aggregations later. After you have named the partition hit Finish.

· Rename the original partition Internet Sales Old
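Before processing, it’s worth double-checking that the two partition queries cover every row exactly once. A minimal sanity check, again assuming an Adventure Works-style FactInternetSales table:

    SELECT
        (SELECT COUNT(*) FROM dbo.FactInternetSales)                                AS TotalRows,
        (SELECT COUNT(*) FROM dbo.FactInternetSales WHERE OrderDateKey <= 20021231) AS OldPartitionRows,
        (SELECT COUNT(*) FROM dbo.FactInternetSales WHERE OrderDateKey >  20021231) AS NewPartitionRows;
    -- TotalRows should equal OldPartitionRows + NewPartitionRows;
    -- if it doesn't, the WHERE clauses overlap or leave a gap.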


Now you can make optional changes to the Storage Settings that were discussed earlier in the article under the recommendations section. You may also find an option called Enable Proactive Caching in the Storage Settings, which will be discussed in a future article. This is a great first step to performance tuning your cube.

