Jul 27 2009

The reason I’m discussing this often-mentioned topic is the following. After many hours spent analyzing how dynamic sets behave in SSAS front-ends, I was asked a perfectly reasonable question: why do we have dynamic sets and when should they be used? I believe this article explains the story behind dynamic sets and their purpose, as well as when and how they should be used. It also warns the reader that the problem is not the implementation of dynamic sets in SSAS, but something else. What? You’ll see when you read it through.

INTRODUCING DYNAMIC SETS

SSAS 2005 and 2008 feature the MDX script as a place to provide calculations. The reason is obvious – concentrating calculations on the server results in better query performance (precomputations, cache, etc.) and easier maintenance.

The SSAS 2005 MDX script supported cube-based calculated members and sets. While cube-based calculated members were equivalent in behavior to their query-based counterparts (except for some newly available internal optimization possibilities), cube-based calculated sets were not. They were useful, but often that was not enough. They didn’t behave exactly the same way query-based sets did and, for many, this was a big issue, because they couldn’t provide the same logic in the cube as in a query. To comply with those requests, the SSAS team introduced a new kind of set in the next version of SSAS – dynamic sets.

Dynamic sets, unlike static sets, are fully equivalent to calculated sets defined in a query. This means BI developers were finally able to achieve the same functionality in the MDX script as in a query.

It’s important to notice that sets defined in SSAS 2005 were not converted to dynamic sets once migrated to SSAS 2008. Today, both kinds of sets exist and both are useful. One variant was named STATIC, the other DYNAMIC, and it’s solely up to the BI developer to prefix them appropriately in their definition. The default is a static set (for historical reasons – 2005 to 2008 compatibility), which means the STATIC prefix can be omitted.

DESIGN AND USAGE

The rule for usage is simple. If a BI developer needs a hardcoded collection of members, one that doesn’t change no matter what happens in the query/pivot, he/she should use a static set. In all other scenarios, a dynamic set can be used. So, how come there are problems here and there using them?
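To make the distinction concrete, here is how the two kinds of sets might be defined in the MDX script (a sketch against Adventure Works; the set names and member keys are made up for illustration):

```mdx
-- A hardcoded collection that should never react to context: static.
Create Static Set CurrentCube.[Bikes And Accessories] As
    { [Product].[Category].&[1], [Product].[Category].&[4] };

-- A set that should adjust to the slicer/subselect of each query: dynamic.
Create Dynamic Set CurrentCube.[Top 10 Products] As
    TopCount( [Product].[Product].[Product].MEMBERS,
              10,
              [Measures].[Internet Sales Amount] );
```

The first set always returns the same two categories; the second should return a different top 10 depending on the context of each query.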

We said that query-based sets should have their cube-based equivalent. And they do, at least in SSAS 2008. When we implement cube-based sets, we enjoy the comfort of being able to refer to them in any query. Equally so, end-users are able to use them in a pivot table (SSAS front-ends), which is an extra plus. That is, unless we hide them, which is rarely the case. Part of our intention is to provide those sets directly to end-users, not just to be able to refer to them in our queries.

Dynamic sets are context-sensitive. Not row-sensitive, not column-sensitive, just slicer (and subselect) sensitive. This means that end-users can experiment by slicing them in a pivot table and see how they change. A typical example is a “Top 10 something” set. When defined as dynamic in a cube, it should cover a whole range of scenarios for end-users. Whether sliced by additional attributes or different dimensions, the set should adjust accordingly. A dream come true for any BI developer – a cube-based solution covering many reports. Because the best solutions are those covered by design, just like a pivot table on top of an OLAP cube is a solution-by-design covering endless T-SQL queries in the relational world. However, not everything went smoothly, at least not on the other side …

SUPPORTING DYNAMIC SETS

Queries are written by BI developers, not end-users, which means they and their sets are under tight control, hidden, and preferably tested well enough. Pivot tables in front-ends are not. In contrast, they are fully exposed to end-users and therefore often limited, not just regarding cube-based sets, but other features as well. Why? It’s simple: the more you restrict your front-end, the fewer worries you have later, knowing that potential end-user actions and their combinations are beyond your imagination. One common example is hiding MDX – you play it safe with your pivot and the features you support in it.

Each front-end tries to keep things under control by balancing between end-users’ requests and features it would rather skip than support. In a multidimensional world the possibilities are endless, and therefore – once you decide to support an SSAS feature, you’ve opened a Pandora’s box whether you know it or not, because everything’s related and all features should fit together just fine.

Every front-end faces these dilemmas – do we support this feature, and if so, in what scope, in what situations, what do we support as additional actions to it, what combinations of existing features are supported with it, etc. (our CubePlayer is no exception there). It’s hard to implement everything properly, to foresee things. I’m not saying it’s impossible, but, let’s say, experience plays a big role here.

Things went wrong when, little by little, it became obvious that the favorite front-end of many (also a relatively young one, so we shouldn’t be too harsh on it) had a problem with dynamic sets. The cause of the problem? Intense use of the subselect.

When an end-user placed a set on rows (or columns), a peculiar syntax was formed and that same set also ended up in the subselect.

By now you must have guessed – we’re talking about Excel 2007.

HOUSTON, WE’VE HAD A PROBLEM

Each front-end generates one or more MDX statements in the background for (almost) every end-user action in its pivot. Regardless of the areas and their names in the front-end’s pivot, which are up to a particular software vendor, there are rows, columns, slicer and subselect areas as the 4 basic parts of every MDX query (some of them can be omitted, though).
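Schematically, those four areas map onto an MDX statement like this (a skeleton, not a runnable query – the ellipses stand for whatever the front-end generates):

```mdx
SELECT
    { ... } ON COLUMNS,            -- columns area
    { ... } ON ROWS                -- rows area
FROM
    ( SELECT { ... } ON 0          -- subselect area
      FROM [Some Cube] )
WHERE
    ( ... )                        -- slicer area
```

Different front-ends distribute the same end-user selections across these areas differently, which is exactly where the trouble described below comes from.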

The MDX query is what gets sent to the SSAS server, so it is up to the front-end to properly convert what the end-user provided in the pivot into correct MDX. The problem is, there can be several solutions to it. It’s all about picking the best one, the one that will work not just in this case, but also with everything we support in our front-end (which brings us back to the front-end dilemma). Using the subselect so often as an approach was not a good decision, because dynamic sets, if nothing else, were obviously neglected. Or not tested properly. The artificial Adventure Works database, a reference for many things, may be misleading (see this SSAS forum thread for more info).

Two of those areas, subselect and slicer, are of importance here. They are similar in nature but also quite different. In order not to lose focus here, we will only state what’s relevant for dynamic sets – the subselect is evaluated before the slicer!

The SSAS server’s behavior can be simplified like this. Internally, it first asks what we have in the subselect and evaluates it. It is important to state here that this evaluation is performed in the context of the cube dimensions’ default members (or another subselect inside it, if they are cascaded, but let’s keep things simple). Then the slicer gets evaluated. Then the axes. Then the cells, either in block mode or cell by cell. Finally, NON EMPTY and HAVING operations are performed on the axes.

Our set is supposed to be on an axis, where the end-user has placed it. That way it would be evaluated after the slicer, adjusting accordingly. Excel 2007, on the other hand, placed it in the subselect. There it can’t be evaluated against the slicer, because the slicer doesn’t exist yet (that would be like knowing the future). But what does exist?

Default members! A dynamic set placed in the subselect is evaluated against the cube hierarchies’ default members. And what is a static set evaluated against in the MDX script? Also default members. Which makes dynamic and static sets the same in Excel 2007 – both static in nature.
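The difference is easy to demonstrate with a dynamic “Top 10” kind of set (a simplified sketch of the two query shapes; the set is assumed to be defined in the cube as [Top 10 Products], ranked by Internet Sales Amount):

```mdx
-- The set on an axis, the slice in WHERE: the dynamic set sees the slicer
-- and returns the top products for year 2003.
SELECT
    [Measures].[Internet Sales Amount] ON 0,
    [Top 10 Products] ON 1
FROM
    [Adventure Works]
WHERE
    ( [Date].[Calendar Year].&[2003] )

-- The Excel 2007 shape: the same set also ends up inside the subselect.
-- The subselect is evaluated first, against default members only, so the
-- set is fixed to the overall top 10 before the year slice can reach it.
SELECT
    [Measures].[Internet Sales Amount] ON 0,
    [Top 10 Products] ON 1
FROM
    ( SELECT [Top 10 Products] ON 0 FROM [Adventure Works] )
WHERE
    ( [Date].[Calendar Year].&[2003] )
```

In the first query the set adjusts to the slicer; in the second it behaves as if it were static, which is exactly the problem described above.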

CONCLUSION

Dynamic sets are implemented just fine in SSAS 2008. That is, they are equal to query-based sets, adjusting to the slicer promptly.

What matters is the front-end you’re going to use them in and its support for them. That’s where problems might occur, either because of a lack of support or, worse, because of improper support.

Finally, it’s up to us, consultants, to educate ourselves properly and then communicate with potential users, advising them on what is and what is not supported in the solutions we intend to implement. The final decision, of course, is the client’s.

Jul 24 2009

Histograms are an important tool in data analysis and as such are used in a variety of fields: digital photography, databases, business, and so on. They represent the distribution of frequencies of an entity or event.

By design, OLAP handles entities much better than events, and the same holds for histograms – entity-based histograms are supported easily, event-based ones are not. Yet, just as there was a way to count “in-flight” events using Mosha’s solution and to handle begin and end times using the approach of Richard Tkachuk and Thomas Kejser, there is certainly a way to score one more time over events in OLAP, this time by enabling event-based histograms. And here’s how …

In order to use histograms, the first requirement is to have a DistinctCount measure, either physical, defined in a separate measure group, or calculated, defined in the MDX script. The first approach is faster, although it requires extra space.

The second requirement is to have an entity (dimension), preferably with some attributes. The third is to have a fact table (cube).

All in all, nothing special except the DistinctCount measure, one per each entity of interest (customers, products, …). A fact table and a dimension with its attributes are something we can expect to find in every cube.
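If a physical distinct count measure group is not an option, a calculated equivalent can be sketched in the MDX script (an illustration only; it assumes a Customer dimension and an existing count measure named [Internet Order Count], and it will be slower than a physical DistinctCount measure):

```mdx
Create MEMBER CurrentCube.[Measures].[Customer Count Calc] As
    -- count only the customers that actually have fact rows
    -- in the current context
    Count( NonEmpty( [Customer].[Customer].[Customer].MEMBERS,
                     [Measures].[Internet Order Count] ) )
, Format_String = '#,#';
```

Adventure Works already ships with a physical Customer Count measure, so this calculated variant is only needed in cubes that lack one.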

Take Adventure Works 2008, for example. It meets all 3 requirements. Therefore, we’ll use the Standard version of it as our example here (it can be downloaded from CodePlex in case you don’t have it yet). But let’s first return to the definition we used for histograms, to analyze the difference between the two types of histogram.

We said there are entity-based frequency distributions and event-based frequency distributions.

When we say entity, we’re actually thinking of an attribute of an entity, like shoe size, weight, height, number of children, marital status, and so on. When we say event, we’re thinking of the occurrence of an entity (or its attribute, since they are related) in a series of events. And a series of events is another name for a fact table.

Adventure Works 2008 features the Customer Count measure, which can be found in the Internet Customers measure group. By using that measure over any attribute of the Customer dimension (particularly those found in the Demographic folder) we get entity-based histograms. Those histograms show us how many customers are single and how many are married, how many earn 10–30K$ and how many 130–170K$, and so on. If the histogram has higher values on the left side in both cases, we can conclude there are more single than married persons in our pool, and more of those earning less than average. Of course, in another pool (a slice per customer city or some other attribute) the situation might be different.

The important thing to notice about entity-based histograms is this – they can only be sliced by additional attributes of that entity (dimension), not by other dimensions or their attributes. Which means this type of histogram is static, time-irrelevant, entity-oriented.

The other type of histogram is therefore dynamic, time-sensitive, fact-oriented. Actually, any-dimension-sensitive, since it is fact-oriented. An example of such a histogram is this: count of customers on the y-axis, distinct number of orders on the x-axis. In other words, we have 0, 1, 2, and so on on the x-axis and our DistinctCount measure on the y-axis. With one problem – numerical values (measures) seem to be on both axes. And that’s something OLAP doesn’t support.

0, 1, 2 and so on represent the distinct number of orders. Although it looks like they should be calculated in MDX using some count expression, we cannot do that. Our task is to display a table/graph. That means we need to iterate over some dimension (rows) and provide the values (measure) across it (columns). We’re talking about a 2D area, a grid.

Although it is possible to generate N calculated measures named 0, 1, 2 and so on and to provide a proper calculation for each of them, there are many obstacles in that approach. First, there can be many bands/intervals, and MDX doesn’t support loops in its DDL (you can’t generate calculated measures in a loop, only one by one, explicitly), which would require a lot of manual work. Secondly, new intervals might appear over time. Some customers might score more than the designer of those calculated measures planned in advance.
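To see why this doesn’t scale, this is roughly what the manual approach would look like – one hand-written calculated measure per interval (a sketch with illustrative names; it anticipates the [Reseller Order Count] measure used later in this post):

```mdx
Create MEMBER CurrentCube.[Measures].[Resellers with 0 Orders] As
    Sum( EXISTING [Reseller].[Reseller].[Reseller].MEMBERS,
         iif( [Measures].[Reseller Order Count] = 0, 1, null ) );

Create MEMBER CurrentCube.[Measures].[Resellers with 1 Order] As
    Sum( EXISTING [Reseller].[Reseller].[Reseller].MEMBERS,
         iif( [Measures].[Reseller Order Count] = 1, 1, null ) );

-- ... and so on, one explicit definition per interval,
-- with no upper bound known in advance
```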

All that leads to a search for another approach. And there is one.

A better solution is to implement the distinct number of orders as a new dimension, a dummy dimension. It is similar to a shell/utility dimension (for those of you who have used one), but also quite different.

Normally, you make a shell/utility dimension with one regular member only and many calculated ones. Or with one regular member and many dummy members, in order to avoid problems with calculated members in some tools :-).

Here we’ll have only regular members, no calculated ones, and we’ll have N+1 of them, where N is the maximum number of orders. All natural numbers, including zero as the first record (that’s the +1). In contrast to a shell/utility dimension, our dummy dimension won’t have any calculations on its members. Hence we call it “dummy.”

We’ll start by creating a named query in DSV, naming it “Reseller Order Histogram” (don’t worry about the pop-up with an error, just click OK):

Create new Named Query in DSV

SELECT
    0 AS Interval

UNION ALL

SELECT TOP
    (
        SELECT TOP 1
            Frequency AS Interval
        FROM
            (
                SELECT
                    ResellerKey,
                    COUNT(SalesOrderNumber) AS Frequency
                FROM
                    (
                        SELECT DISTINCT
                            ResellerKey,
                            SalesOrderNumber
                        FROM
                            dbo.FactResellerSales
                    ) AS t1
                GROUP BY
                    ResellerKey
            ) AS t2
        ORDER BY
            Interval DESC
    )
    ROW_NUMBER() OVER (ORDER BY ResellerKey) AS Interval
FROM
    dbo.FactResellerSales

That T-SQL gives us N+1 rows, where N is the maximum number of orders made by any reseller in the fact. In Adventure Works 2008, that’s 12.

It’s a bit of a complex query, but if you execute its inner parts one by one, separately, you’ll understand what’s going on. Also, I’m no expert in T-SQL, so perhaps it could be optimized :-).

Then we’ll build a simple dimension, “Reseller Orders Histogram”, and set both the name and value columns to that same Interval field (see the picture below).

Reseller Orders Histogram dimension

We’ll fully process that dimension, then add it to the Adventure Works cube without linking it to any measure group. Finally, we’ll add the code below at the end of the MDX script in order to create the calculations (new measures) that we’ll use in combination with the Reseller Orders Histogram dimension:

Create MEMBER CurrentCube.[Measures].[Reseller Orders Frequency] AS
     Sum( EXISTING [Reseller].[Reseller].[Reseller].MEMBERS,
          iif( [Reseller Orders Histogram].[Interval]
                   .CurrentMember.MemberValue =
               [Measures].[Reseller Order Count],
               1,
               null
             )
        )
, Format_String = '#,#'
, Associated_Measure_Group = 'Reseller Orders'
, Display_Folder = 'Histogram'
;

Scope( ( [Measures].[Reseller Orders Frequency],
         Root( [Reseller Orders Histogram] ) ) );
    This = Sum( [Reseller Orders Histogram].[Interval].[Interval].MEMBERS,
                [Measures].[Reseller Orders Frequency]
              );
End Scope;

Create MEMBER CurrentCube.[Measures].[Reseller Orders Frequency Total] AS
     ( [Measures].[Reseller Orders Frequency],
       [Reseller Orders Histogram].[Interval].DefaultMember )
, Visible = 0
, Format_String = '#,#'
, Associated_Measure_Group = 'Reseller Orders'
, Display_Folder = 'Histogram'
;

Create MEMBER CurrentCube.[Measures].[Reseller Orders Frequency %] AS
    iif( [Measures].[Reseller Orders Frequency Total] = 0,
         null,
         [Measures].[Reseller Orders Frequency] /
         [Measures].[Reseller Orders Frequency Total]
       )
, Format_String = 'Percent'
, Associated_Measure_Group = 'Reseller Orders'
, Display_Folder = 'Histogram'
;

After deploying the cube, we can test it in the cube browser (reconnecting first) in order to see the changes.

Open the measures, then the Reseller Orders measure group, then the Histogram folder. Drop both measures into the data area. Drop the Reseller Orders Histogram dimension on rows. See how the values populate the grid. Then test by adding years from the Date dimension on columns. See how the total stays the same (unless some years are unchecked) while individual years have their own data. You should know that you might experience some delays, depending on what you bring into the pivot (don’t exaggerate).
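The same check can be made with a plain MDX query instead of the cube browser (assuming the dimension and measures were created as above; the year key is illustrative):

```mdx
SELECT
    { [Measures].[Reseller Orders Frequency],
      [Measures].[Reseller Orders Frequency %] } ON 0,
    [Reseller Orders Histogram].[Interval].[Interval].MEMBERS ON 1
FROM
    [Adventure Works]
-- slice by a year to see the histogram adjust
WHERE
    ( [Date].[Calendar Year].&[2003] )
```

Changing or removing the slicer should redistribute the frequencies across the intervals while the hidden total stays consistent.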

Using Histogram dimension in cube browser

That’s it. You’ve made an event-based histogram. Hope you liked it :-).

In the end, it’s worth saying that this solution can be reduced to a fixed number of intervals (to speed up the results, or to get better, smaller graphs in cases when there are many intervals, i.e. > 50). One way would be to keep the first intervals while grouping the last ones: 0, 1, 2, … 25, 25+. A variant of this is to group on both sides, in case all the data is concentrated in the middle. Finally, the third option is to reduce the resolution of the x-axis (sampling, making bands) by combining several adjacent intervals into one: 0, 1-10, 11-20, 21-30, etc. Keep in mind that the SQL and MDX calculations should be adjusted in each of these scenarios.

A final note: I deliberately switched the axes when I made the cube browser example, because the data can be seen much better this way (otherwise it spans too much to the right). Normally, when you design a graph for it, you should use a bar chart, preferably oriented vertically, as in the screenshot below.

Histogram in CubePlayer (bar chart)

Jun 16 2009

Universal calculations in MDX was the title of my session at KulenDayz 2009 conference, held in Beli Manastir, Croatia, on 13th of June, 2009.

The presentation, MDX queries and SSAS projects (based on the Adventure Works 2008 sample) of this adapted version of the presentation held in Manchester this year can be downloaded here.

This revision is a small step towards my goal of writing a paper on this subject. But that has in the meantime become a low-priority task and is therefore postponed.

Jun 09 2009

One year ago, I discovered the Microsoft MSDN SSAS forum, registered as a new user and started testing myself by answering other users’ questions. Pretty soon I started to like it and kept coming back quite often.

Since SSAS is my specialty, I had many correct answers, which positioned me among the top answerers right from the beginning (and kept me there throughout the year, which I had also set myself as a goal and publicly announced inside the company). Other stats for the period are as follows: 1000+ posts, 200+ answers, 4500+ points.

I’m satisfied with that score and the goal achieved, but far more with two other things: I helped a lot and I learned a lot. That’s because I like to help/teach, I like to learn/be taught, and I like switching between the two from time to time. It enriches me. Besides, it’s in my nature (which a careful follower of my SSAS posts has surely noticed by now, if not for anything else, then for the length of my posts :-)).

My final conclusion is that the activity in the forum was a wonderful experience. I recommend it to everyone.

 

In the following year I’ll continue to contribute to the MSDN forum, though in a somewhat reduced way. I plan to engage more in writing blog posts and articles for magazines, learning new things, passing exams, presenting and organizing courses (besides my regular job, of course). And that takes time.

What will come out of it, we’ll see in next year’s review. The adventurous spirit is ready for the sequel …

Apr 22 2009

The company I work for, SoftPro Tetral, is one of the exhibitors at the European PASS Conference 2009.
 
Miljenko Akik (CEO) and I will run live demos of CubePlayer on our two laptops, with additional LCDs for the audience. In case you want to give it a try, ask a question, see the new version of our SSAS front-end in action, or just meet us, be sure to stop by.
 
See you.
 
 
Apr 01 2009

“I made a trojan in MDX script!”

Although I deliberately timed this post to be published on my blog on the 1st of April, it is far from a joke. And this trojan is a good one, for the benefit of all, as you’ll soon see.

While preparing my presentation about the potential use of the Axis() function in calculated measures, I came upon a very interesting discovery: I wrote an MDX expression that allows the use of the Axis() function inside dynamic sets in the MDX script! Since the use of the Axis() function is not supported in dynamic sets, I named the expression the Trojan horse for dynamic sets, because of how it tricks the engine and lets the Axis() function pass through.

Let’s say a few words about dynamic sets. Their name suggests they are evaluated dynamically. Furthermore, the MSDN definition says they are evaluated every time they are used in a query. But all that is simply not so. They are actually less powerful (about which I just googled a post from Chris Webb), because they get evaluated only once, before the evaluation of cells (where all the fun usually happens).

Dynamic sets are evaluated much sooner in the query execution phases – after (which means in the context of) the slicer and subselect. And that brings us to a final conclusion. As opposed to static sets, which are static all the time, dynamic sets are just fine-tuned according to the slicer and subselect of a query, but are pretty much static when it comes to the evaluation of cells on the intersection of axes. In short, they behave just like query-defined sets. And I consider them static. Get the idea of a problematic name?

I don’t know where I read that, so I made a short search on the net in order to present some proof for it. The best article on the subject that I came upon in a relatively short time is an older post from Mosha (of course). Since I haven’t read it yet and it looks very related to the thing I’m about to show you, I’ll read it now in order not to repeat things here.

Ok, I read it. That thing about subselect is nice.

Now let’s get back to what I meant to present to you.

While being active on the MSDN SSAS forum, I came upon this thread, where Bryan C. Smith nicely pointed out to me that my advice about the use of the Axis() function simply doesn’t hold, since it’s not allowed in the MDX script. I played with it for a while, but encountered some problems, so I quit and concurred with him.

But, being persistent (not to say stubborn) as I am, I took another chance as soon as I saw the first opportunity for it, at a time when I was deeply into that issue – preparing a session about the Axis() function. And I succeeded! Yes!

If you need a reason for it, I mean, a reason why I wanted to do it, just take a look at that thread – making dynamic sets evaluate according to a measure placed on the opposite axis. Isn’t that a great possibility? I mean, people ask for it, yet it isn’t supported. So, what do we do? Wait for the next release, for the new Extra Dynamic Sets? Yeah, right.

Why not think of something magical and use it now, this same moment. Why wait? All we need is some kind of expression that will evaluate differently in the MDX script as opposed to the query context. But how? What differentiates those two environments?

When a query executes, there are axes. When the MDX script is evaluated, we have the default measure in context. And the iif function is optimized in various programming languages not to evaluate both paths, for performance reasons. Could that be my gate for the trojan? Can I somehow make a trojan horse in the MDX script that will carry the Axis() function hidden inside various iifs and other constructs, and at the same time trigger properly in the context of a query?

Yes! It can be done. Here it is.

Create Dynamic Set CurrentCube.[Query measures evaluated] As
-- aka Trojan for Dynamic Sets
        iif(Measures.CurrentMember Is Measures.DefaultMember,
            iif(IsError(Axis(1).Count),
                iif(IsError(Axis(0).Count),
                    {Measures.CurrentMember},
                    iif(IsError(Extract(Axis(0), Measures).Count),
                        {Measures.CurrentMember},
                        Extract(Axis(0), Measures)
                       )
                   ),
                iif(IsError(Extract(Axis(1), Measures).Count),
                    {Measures.CurrentMember},
                    Extract(Axis(1), Measures)
                   )
               ),
            {Measures.CurrentMember}
           )
      , Display_Folder = 'Universal dynamic sets';

Although it refers to the Axis() function, this dynamic set evaluates correctly inside the MDX script, yet returns a set of measures relevant for the query being executed when triggered from that same query. What does that mean?

If the query doesn’t contain any measure, the default one is returned. If the query contains a measure in the slicer, that measure is returned. If a query contains measures in a subselect, (only) the first measure inside is returned (the default for that subcube) – here’s a possibility for improvement, for the eager ones. If a query contains measures on columns, they are returned. If the query contains measures on rows, they are returned. All in all, it collects measures from the query, wherever they are.

How is it achieved?

First, let’s get one thing straight. Everything you write in the MDX script of a cube needs to be evaluated and checked for errors before it is deployed. Meaning, if you naively reference Axis() inside it, without cloaking it properly, you’ll see an error saying the use of the Axis() function is not allowed, and you won’t be able to deploy. So we need to cloak it somehow. And here’s how.

When that expression is evaluated/checked in the MDX script, the first condition (always) evaluates as False. That’s the main trick. Why? The CurrentMember doesn’t exist yet; it exists only in a query! This means the whole expression results in an empty set ({Measures.CurrentMember} in the False part evaluates to a null set). The SSAS engine doesn’t check the construct in the True part (a performance optimization, luckily for us) and treats it as a deterministic expression – an empty set. In other words, this: {}.

Everything’s clean and fine. We’re in (whisper the men in the horse)!

Now, we could have put anything that evaluates to False in the first condition, but we didn’t. We carefully constructed that condition, because we need it to evaluate differently once it is called from a query. CurrentMember is a valid expression in the context of a query. On the other hand, the current member might or might not be the default member. And that means we got ourselves a perfect condition: one that evaluates either as True or False when triggered from a query (which is OK – our condition is therefore not static, but a dynamic, smart one) and one that is always False (static, which is OK) when evaluated/checked inside the MDX script for deployment.

If the set is evaluated in a query, then, depending on where the measures are placed in that query, a different path will be chosen. The first condition will be True in most cases. Only in situations when some other measure, not the default one, is placed in the slicer or subselect will it evaluate to False. How come? Remember that a dynamic set is evaluated in the context of the slicer and subselect, so that first expression catches the scenarios when we have something there. In case it is False, we will grab the current member. That means the measure in the slicer, or the first measure in the subselect (since, when you make a subselect, the first measure inside automatically becomes the default one for that subcube). And that is exactly what we need.

In other words, the first condition will evaluate as False and take that path only during the MDX script evaluation AND for a measure in the slicer/subselect other than the default one. Two flies, one stroke!

In all other cases the first condition will evaluate as True, which means we don’t have measures in the slicer or subselect. If that is so, they might be on an axis. We test Axis(1) first, then Axis(0). If by some chance there are no measures on either of the axes (a query without measures was executed), the default measure is implicitly in the slicer and is taken as the current member.

If measures are on an axis, they are carefully extracted.

That’s basically the operating mode of this trojan.

The explanation above was corrected and enriched 8 hours after the initial post, after a revision and after my memory of all the tips and tricks concerning this issue was refreshed. The key parts that were changed are the evaluation paths.

Now, a typical dynamic set looks like this.

Create Dynamic Set CurrentCube.[Top 10 Products for slicer measure] As
        TopCount(
                 [Product].[Product].[Product].Members,
                 10,
                 Measures.CurrentMember
                )
      , Display_Folder = 'My demo sets';
 

It operates only on slicer and subselect, as we stated before.

However, this new dynamic set, an improved version of the previous example, uses the first measure from the trojan no matter where it is (as long as it is not on the same axis – it can be on the opposite axis or in the slicer/subselect).

Create Dynamic Set CurrentCube.[Top 10 Products for first measure] As
        TopCount(
                 [Product].[Product].[Product].Members,
                 10,
                 [Query measures evaluated].Item(0).Item(0)
                )
      , Display_Folder = 'My demo sets';
 

And finally, in order to test all this, you need to add those definitions to your Adventure Works 2008 database (at the end of the MDX script) and fire the following queries (or your variants of them).

SELECT
NON EMPTY
    {
        [Measures].CurrentMember
    }
ON AXIS(0),
    NON EMPTY
    {
        [Top 10 Products for first measure]
    }
ON AXIS(1)
FROM
    [Adventure Works]

SELECT
NON EMPTY
    {
        [Measures].[Order Quantity],
        [Measures].[Reseller Sales Amount]
    }
ON AXIS(0),
    NON EMPTY
    {
        [Top 10 Products for first measure]
    }
ON AXIS(1)
FROM 
    [Adventure Works]

SELECT
NON EMPTY
    {
        [Measures].[Reseller Sales Amount],
        [Measures].[Order Quantity]
    }
ON AXIS(0),
    NON EMPTY
    {
        [Top 10 Products for first measure]
    }
ON AXIS(1)
FROM 
    [Adventure Works]

SELECT
NON EMPTY
    {
        [Measures].CurrentMember
    }
ON AXIS(0),
    NON EMPTY
    {
        [Top 10 Products for first measure]
    }
ON AXIS(1)
FROM
    (SELECT [Measures].[Order Quantity] ON 0 FROM [Adventure Works])

SELECT
NON EMPTY
    {
        [Date].[Calendar].[All Periods]
    }
ON AXIS(0),
    NON EMPTY
    {
        [Top 10 Products for first measure]
    }
ON AXIS(1)
FROM
    [Adventure Works]
WHERE
    ([Measures].[Order Quantity])

In case you have any doubts about the results, feel free to make another query, one in which you’ll write precisely what you want, so that you can verify whether the Top 10 Products dynamic set calculates correctly or not.

Once you cast your fears aside, feel free to use it in any of your projects in case you need it.

Oh, one more thing. Excel 2007 and OWC are immune to it; they cannot profit from this discovery. Only those OLAP clients/tools that generate clean MDX (no fancy stuff in the subselect) and respect sets in general as a valid and powerful entity in OLAP (or at least let you edit the MDX, if nothing else) are candidates for exploiting this trick. MDX Studio and CubePlayer are the ones known (and available) to me. Any information about other front-ends/tools?

More advanced MDX stuff coming soon :-).

Mar 312009
 

Universal calculated measures in MDX queries was the title of my session at the SQLBits IV conference held in Manchester on 28th of March, 2009. Its abstract follows.

Among its many functions, the MDX language has one special set function – the Axis() function. That function allows the creation of calculated measures that are fully context aware and, if desired, don’t need to refer to any dimension or hierarchy in the cube. In other words, such measures are universal or independent, which means they can be used in any MDX query.
In this session we will present such measures and explain how they work. We’ll also show how to design them for various scenarios and discuss their potential and weaknesses.
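As a small taste of the technique, here is a minimal sketch (against the Adventure Works 2008 sample) of a measure that counts the tuples on rows via Axis(1), without naming any dimension or hierarchy – it keeps working no matter what is placed on that axis:

```mdx
WITH
MEMBER [Measures].[Rows on Axis 1] AS
    -- Axis(1) returns the set on rows, whatever it contains
    Axis(1).Count
SELECT
    {
        [Measures].[Rows on Axis 1]
    } ON 0,
    {
        [Product].[Product Categories].[Category].MEMBERS
    } ON 1
FROM
    [Adventure Works]
```

Swap the rows set for any other hierarchy (or a crossjoin of several) and the measure still returns the row count – that independence is what makes such calculations universal.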

The presentation together with all the queries, SSAS projects (based on Adventure Works 2008 sample) and other materials can be downloaded here.

Since the topic of using the Axis() function is much bigger than what can fit in an hour or so, my intention is to write a paper on the subject, similar to the papers about the utility (shell) dimension found here or many-to-many dimensional modeling found here, in order to present and explain all the findings properly, so that they can be used in various BI projects. The plan is to publish it here, on this blog, by May 2009. The working title? Universal calculations in MDX.

Jan 192009
 

 

While doing some calculations for a client, I encountered a not so strange wish – to show an incremental number for each row of data. Although it seems pretty easy, I ended up playing with it for several hours, because it had to be a generic solution, one that doesn’t depend on the hierarchies placed on rows.

Finally I got it! Here it is.

 

WITH

MEMBER [Measures].[Count of Rows] AS
    Axis(1).Count

MEMBER [Measures].[Count of Hierarchies on Rows] AS
    Axis(1).Item(0).Count

MEMBER [Measures].[Row Number] AS
    Rank(
         StrToTuple(
                    "( " +
                     Generate( Head(Axis(1),
                                    [Measures].[Count of Hierarchies on Rows] ) AS L,
                              "Axis(1).Item(0).Item(" +
                                CStr(L.CurrentOrdinal - 1) +
                              ").Hierarchy.CurrentMember",
                              " , " ) +
                    " )"
                   ),
         Axis(1)
        )
    , FORMAT_STRING = "#,#"

SELECT
    {
        --[Measures].[Count of Rows],
        --[Measures].[Count of Hierarchies on Rows],
        [Measures].[Row Number]
    } ON 0,
    {
        [Date].[Calendar].[Calendar Year].MEMBERS *
        [Product].[Product Categories].[Category].MEMBERS
    } ON 1
FROM
    [Adventure Works]

 

It needs a bit more tweaking to become bulletproof, but I’ll leave that for some other occasion.

A remark for Excel users. Beware that Excel sometimes tricks you and keeps hierarchies on columns although they look like they are on rows. If you’re using Excel as an OLAP client, make sure you put 2 measures on columns; that will force the hierarchies placed on rows to actually appear there.

Jan 182009
 

 

2008 was a very good year for me. I’ve been on maternity leave with my son, engaged in the SSAS forum, went to London and Seattle, and met some very nice people from all around the globe, virtually and in person.

I fulfilled all the goals I set for myself:

1) Become recognized as one of the people who are good at SSAS
2) Achieve and retain a top-half position in the SSAS forum
3) Spread the word about our product, CubePlayer
4) Pass my first MS exam (70-445)

Although there were some losses (my pension fund, for example, like everyone else’s, I guess), the overall gain is much greater. And despite the current crisis, I believe 2009 will bring even more good things for me and the company I work for. It’s only up to us to reach for them!

Nov 112008
 
 
Hi everybody,
 
I’m preparing for a trip to Seattle where I’ll be attending PASS Community Summit 2008.
 
Besides gaining new knowledge from presenters and their sessions, I’m also anxious to have a word or two with some of the people I met on the MSDN SSAS forum, or anybody else who spends their time working with Analysis Services. If you happen to see me there (the picture is on my blog’s home page), be sure to say hello.
 
For anyone interested in seeing CubePlayer in action, I’ll have plenty of USB sticks with its installation/runtime and other materials as a free giveaway. Whether on your laptop or mine, whether in a crowded lobby or a more peaceful lounge, I’ll be glad to show you some of its features in a live, one-on-one demo.
 
See you …