
Dynamic Data Provider within BusinessObjects v4.0

December 19th, 2012

As customers migrate from the classic Business Views, one feature missing from the new Information Design Tool is a straightforward dynamic data connection option when defining connections.  In fact, it has already been mentioned in Ideas Place along with a great use case.

Dynamic Data Connections

Dynamic Data Connections allow a database connection to be determined by the answer to a user prompt.  This capability is not often used but is extremely powerful for organizations that have exactly the same database schema across multiple instances.

Normally within the universe you can use table mappings to automatically point the user to the correct database, but in some cases the user must be able to select the desired database on the fly.

Once I began to look into this issue, I realized that I could use the power of @prompt and table mapping to dynamically determine the name of the database to connect to.  This solution works extremely well.  The only restriction is that all of my databases must be reachable through a single Universe connection, i.e. I can't have the same database schema spread across disparate platforms, e.g. Microsoft SQL Server and Oracle.

My Databases

In my example I am using MS SQL Server 2008.

I set up two databases:  DB1 and DB2.  Both databases contain a single table called Customers.  The report consumer needs to dynamically switch between DB1 and DB2.

Defining The Connection

The first thing you need to do is define a connection.  It's very important that the connection points to the first of the two databases.  By default, I don't want to require a table owner and table qualifier to be defined.  In my case, I named the connection DDC_Connection and it points to my DB1 database.

The Data Foundation

This is where the magic happens, but we'll get to that later.  We are going to add the Customers table to our Data Foundation.  This is where you define all your standard table joins.  In my case, the data foundation is called DDC_Foundation and it contains one table, Customers.

You will want to use Show Column Values and Profile Column Values to test your data foundation.

The Universe

Next we will set up our Universe as if it were only connecting to a single database.  Use the Data Foundation we've already defined to finish building your Universe.  In my case the Universe is called DDC_Universe.

You will want to use the Show Values and Queries capabilities within IDT to test your Universe and make sure everything is working properly.  You may wish to test it with WebIntelligence as well.

The Magic!

Now that you’ve tested the Universe and it’s working with your default database, you want to make it dynamic by replacing the table names with an @prompt statement.  This statement will be interpreted by the reporting engine and replaced with the returned string.  In my case:

@prompt('Select DB:','K',{'DB1','DB2'},mono,constrained)
will be replaced with DB1 or DB2.

NOTE:  If you are unfamiliar with the options for @prompt, you may read about them in the IDT User's Guide.

I replaced the Customers table name with the @prompt command followed by the necessary owner and table name, .dbo.Customers.

Therefore, the resulting string you will paste into the table name will be something like:

@prompt('Select DB:','K',{'DB1','DB2'},mono,constrained).dbo.Customers
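To see why this works, consider what the reporting engine actually sends to the database.  The prompt token is replaced with the user's answer before the SQL executes, so (assuming the Customers table carries the Region column shown later in this post) the generated SQL would look something like this sketch:

```sql
-- Sketch of the generated SQL after the user answers the prompt with DB1;
-- the @prompt token has been replaced by the selected string, so the
-- three-part name resolves to DB1.dbo.Customers.
SELECT Customers.Region
FROM DB1.dbo.Customers Customers
```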

Save the Data Foundation.

The Results

Open the Universe and have a look.  Everywhere you saw the table name, it has now been replaced with the @prompt string.  Below is a screenshot of the resulting Region field.

If I now right-click and choose Show Values…

The @prompt is evaluated at run time and you will see a prompt.

In my case I selected DB1, so the region Georgia appeared in the list.

The only challenge with this magic is that suddenly my universe can seem unnecessarily complex and difficult to maintain.  Therefore, to keep the universe as simple as possible, we can use table mappings to apply the prompt at run time only for non-universe-designer users.

Better Magic:  Using Table Mapping

If you use table mapping, then you leave your universe in the default state such that the data foundation does not contain an @prompt.

Next, select the IDT security editor from the top menu bar.

This will allow you to define a security profile and apply it to a set of users.  Typically you would want this rule to apply to all non-Universe Designer Users.

First, select Users/Groups at the bottom of the screen so you can select the users you want to apply the Data Security profile to.

Select the Group of users you want to apply the table mapping to.  (I selected Everyone for testing.)   Next, select the universe you want to apply the table mapping to.  In this example I selected DDC_Universe.unx.

Next, click the icon to create a new data security profile.  I recommend you rename the profile to clearly indicate what it is used for, e.g. Dynamic DB Connection.

Click on the Tables tab and then Insert to create a new table replacement.

You will be replacing your table names with the @prompt/table name combination values.  The table mapping puts quotes around the replacement strings, so you must specify the Qualifier, Owner, and Table separately.

Here is what it should look like after you have entered in the values.

http://trustedbi.com/images/blog/BO40/dynamic/ddc_replacementtable.png

The final step is to activate the newly created security profile for the group and universe you've selected.  Simply check the box next to the security profile.

Experience The Magic

After everything has been set up, the WebIntelligence reporting engine will apply the security profile whenever a matching user runs a report using a secured universe (i.e. one with a security profile applied).

Here is what the user will see in WebIntelligence.  They will select the universe, choose the objects they want for the query, and run it.

When the user selects Run Query, they will receive a prompt to select the database, because the security profile defined on this universe applies to this user.

After the user selects the database,  they will see the results based on the database they chose.

Extra Credit

In the example I shared here, I hard-coded the list of databases in the @prompt selection.  If you have a large number of databases, you can instead create a table in your own database which lists all of them.  One customer with hundreds of distinct databases created a column called DB Name and made it part of the universe for reporting purposes.  They then referred to it in the query as:

@prompt('Select Data:','K','DB Version\DB Name',mono,constrained).dbo.tablename
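The lookup table behind such a prompt can be as simple as one column with one row per database.  The table and column names below are illustrative only:

```sql
-- Hypothetical lookup table feeding the prompt's list of values;
-- the universe object (DB Name) would point at this column.
CREATE TABLE dbo.DB_Versions (
    DB_Name varchar(128) NOT NULL  -- e.g. 'DB1', 'DB2', ...
);
```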

Troubleshooting

If you use the wrong @prompt parameter, namely 'A', quotes will be returned surrounding the string; this makes the generated syntax invalid and you may see a message like this:

Database error: Incorrect syntax near (IES 10901) (WIS 10901)

There were some known issues with @prompt in earlier releases of BI 4.0; they were fixed in 4.0 SP4 Patch 6.

The Semantic Layer Amazes

I'm sorry, but I have to say it.  The breadth and depth of the semantic layer never cease to amaze me.  This is a perfect example.  When Business Objects first came up with the idea of a semantic layer, they got it right… and most of the features, like table mapping, have been there for years!

They say that 80% of customers only use 20% of the functionality that exists in a product.  I’m sure glad the other 80% is there when I need it… and all the pieces fall into place.

«Good BI»

 

 

Supercharge Your Universe with Time-Series Analysis – Part 3

January 18th, 2010

The Semantic Layer allows for powerful time-based analysis of any relational database.  The ability for administrators to create complex time-based measures and filters means that business users can access the data they need to answer ad-hoc questions.  We covered these topics in more detail in the previous two posts.

In the past, organizations took weeks to answer such questions or instead these questions went unanswered because the reports took too long to build.  That is no longer true.

Easiest of All

I use Quicken to track my personal finances.  Last Year, Last Quarter, and Last Month are common date ranges that I like to use over and over.  Unfortunately, BusinessObjects doesn't provide these out of the box… but with a little creativity you can build them yourself.  It's quite simple really, and when you see how simple it is, you will be shocked.

Recipe for Success

The Recipe for Simplicity

The first thing we need to create is a database view with calculated dates that change automatically from one day to the next.  The view will contain three columns:

  1. Name of the Range
  2. Start Date
  3. End Date

With these three columns we can build any date range we wish.  Here is what the results look like when you view the table:

Results from Date Range

Note that each day, based on the associated view, these dates will change.

Here is an excerpt of the logic in SQL Server to create the view:

-- dbo.syscolumns is used only as a row source; UNION removes the duplicates.
SELECT 'Last 7 Days' AS Date_Range,
       CONVERT(smalldatetime, {fn CURDATE()}) - 6 AS Begin_Date,
       CONVERT(smalldatetime, {fn CURDATE()}) AS End_Date
FROM dbo.syscolumns
UNION
SELECT 'Today' AS Date_Range,
       CONVERT(smalldatetime, {fn CURDATE()}) AS Begin_Date,
       CONVERT(smalldatetime, {fn CURDATE()}) AS End_Date
FROM dbo.syscolumns
UNION ... etc.

Download the Full SQL Statement HERE.

Isn't that AMAZING!?!  Now all you need to do is create a BETWEEN join from the date field on your fact table to the start and end dates within the Date Range table.  Here is the logic:
dbo.Current_Facts.sales_date between dbo.Date_Ranges.Begin_Date and dbo.Date_Ranges.End_Date

It should look something like this:

Date Range IS BETWEEN Fact Table

Now that I have this dynamic date range defined within my universe I can use it two ways:

  1. I can use it to simplify the traditional prompts that typically include a 'start date' and 'end date' field.  BusinessObjects provides optional parameters, so the report could allow the user to choose which type of prompt they wish to use.
  2. Alternatively, I can use these fields for scheduled reports.  Users can specify “Yesterday” in the  prompt so that each day when the report runs, the prompt is automatically updated.
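Putting the pieces together, a scheduled "Yesterday" report would generate SQL along these lines (a sketch using the table and column names from the examples in this series):

```sql
-- Because the view recalculates Begin_Date and End_Date each day,
-- this query always returns yesterday's rows without any report changes.
SELECT d.Date_Range, SUM(f.store_sales) AS total_sales
FROM dbo.Current_Facts AS f
JOIN dbo.Date_Ranges AS d
  ON f.sales_date BETWEEN d.Begin_Date AND d.End_Date
WHERE d.Date_Range = 'Yesterday'
GROUP BY d.Date_Range
```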

Additional Downloads

This series was inspired by some incredible work done by Richard Reynolds while he was working at SAP BusinessObjects.  He has an amazing way of taking basic principles around technology and applying them in ways that are so simple.  The idea of the Begin/End Date connected to the fact table was his genius.  Having the chance to work with Richard was a true highlight of my career at SAP BusinessObjects and I wish him the best in all his future ventures.

If you would like the entire Universe, which contains fabulous examples of everything I've been showing during this three-part series, you can download it here.

Download the Foodmart Universe (Foodmart.unv)
Download the Foodmart Database (SQL Server 2005 BAK File or MDF/LDF Files)

Enjoy,

«Good BI»

Supercharge Your Universe with Time-Series Analysis – Part 2

January 6th, 2010

Using time is one of the most important ways to analyze information.  Each day, week, month, and year is unique and can best be understood within its specific context.  In my last post, we showed how you can use the Semantic Layer to create time-based filters.  Today we are going to discuss time-specific measures.

Creating Time-Specific Measures

Time-specific measures are important because they give us additional flexibility within WebIntelligence.  Instead of creating two queries, one for last year's sales and one for this year's sales, we can create one query and then use the definition within the measure objects to filter the records we need.  The following examples are for SQL Server:

YTD Sales

sum(case when datepart(yyyy,dbo.Current_Facts.sales_date) =
 datepart(yyyy,getdate()) and dbo.Current_Facts.sales_date<=
 getdate() then dbo.Current_Facts.store_sales else 0 end)

Previous YTD

sum(case when datepart(yyyy,dbo.Current_Facts.sales_date) =
 datepart(yyyy,getdate())-1 and datepart(dy,dbo.Current_Facts.sales_date) <=
 datepart(dy,getdate()) then dbo.Current_Facts.store_sales else 0 end)

Now I can run a report, place these two measures on it, and the values for Current YTD and Previous YTD are already filtered accordingly.  What's great about these measures is that they can be exceedingly complex within the universe, yet easy for users to work with.  Users don't need to worry about accidentally filtering out too many records, because the filtering within the CASE statement is non-exclusive.
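Because each measure filters inside its SUM rather than in a WHERE clause, both can be fetched in a single query.  A sketch of the generated SQL:

```sql
-- One query, two time-specific measures: the CASE expressions do the
-- filtering, so no WHERE clause is needed and neither measure excludes
-- rows the other one needs.
SELECT
    SUM(CASE WHEN datepart(yyyy, sales_date) = datepart(yyyy, getdate())
              AND sales_date <= getdate()
         THEN store_sales ELSE 0 END) AS ytd_sales,
    SUM(CASE WHEN datepart(yyyy, sales_date) = datepart(yyyy, getdate()) - 1
              AND datepart(dy, sales_date) <= datepart(dy, getdate())
         THEN store_sales ELSE 0 END) AS prev_ytd_sales
FROM dbo.Current_Facts
```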

Try this one on for size: Year-to-Date Sales vs. Previous Year-to-Date Sales as a percentage.

YTD Sales vs Prev YTD Sales %

-- If Previous YTD is zero, return 0 to avoid a divide-by-zero error;
-- otherwise return (YTD / Previous YTD) - 1.
case
  when sum(case when datepart(yyyy, dbo.Current_Facts.sales_date) = datepart(yyyy, getdate()) - 1
                 and datepart(dy, dbo.Current_Facts.sales_date) <= datepart(dy, getdate())
            then dbo.Current_Facts.store_sales else 0 end) = 0
  then 0
  else (sum(case when datepart(yyyy, dbo.Current_Facts.sales_date) = datepart(yyyy, getdate())
                  and dbo.Current_Facts.sales_date <= getdate()
             then dbo.Current_Facts.store_sales else 0 end)
        / sum(case when datepart(yyyy, dbo.Current_Facts.sales_date) = datepart(yyyy, getdate()) - 1
                    and datepart(dy, dbo.Current_Facts.sales_date) <= datepart(dy, getdate())
               then dbo.Current_Facts.store_sales else 0 end)) - 1
end

Now that's some complicated SQL, and yet, because of the power of the semantic layer, all the end user has to do is drag the object onto the report.  The database will do the rest.

You can do the same on a monthly level.  Here is a list of time-based measures that I use all the time.

Time-based Measures

As you can imagine, it’s a great way to provide options to your users that they would never have dreamed possible before.  In my final post, part 3, I will provide a copy of the universe for you to download.

«Good BI»

Supercharge Your Universe with Time-Series Analysis – Part 1

December 29th, 2009

To some degree or another, as we close out the year, we will all reflect on 2009 and what has been accomplished.  There are always things we might have done better, but looking at things over time helps to put them in perspective, both in life and in business.  This is the first in a three-part series.

Time Puts Everything in Perspective

When it comes to running reports and looking at your data are you leveraging time like you should?  Most organizations understand their performance month to month or year over year, but how much flexibility do you give your analysts to look at the information within the time segments they choose?  Some common time-series are:  YTD (Year to Date), MTD (Month to Date), WTD (Week to Date), but there is so much more.  As we prepare to close out 2009 and welcome in 2010, I challenge you to see if you are really using the power of time in your BI solution.

Insight Through Time

It’s important to always analyze your figures in comparison to other equivalent times, e.g. YTD with PYTD (Previous YTD).  The same with MTD and WTD, but how do you do it?  Within the Universe you can create filters.  These filters can be applied to any query.  Here is the SQL required.

YTD (Year to Date)

datepart(yyyy,dbo.Current_Facts.sales_date)=datepart(yyyy,getdate()) and
dbo.Current_Facts.sales_date<= getdate()

PYTD (Previous Year to Date)

datepart(yyyy,dbo.Current_Facts.sales_date)=datepart(yyyy,getdate())-1 and
datepart(dy,dbo.Current_Facts.sales_date)<=datepart(dy,getdate())

Because you can run multiple queries in BusinessObjects, you can simply create a query with the YTD filter, then "duplicate" the query and replace the YTD filter with the PYTD filter.  The dimensions of the two queries will be merged automatically.  Here is the query panel:

And here are the results:

…but sometimes you don't want to have to build two queries.  Why not just get all the results for the last two years and split things up within the measure?  Well, the power of the semantic layer allows you to do that too.  I'll explain more about that in my next post, but before I do, let me share some additional common filters you should consider using:

Last 7 Days

dbo.Current_Facts.sales_date between convert (SMALLDATETIME,
{fn CURDATE()})-7 and convert (SMALLDATETIME, {fn CURDATE()})

Last n Days

dbo.Current_Facts.sales_date between getdate()-
@Prompt('Days Ago','N',,mono,free) and getdate()

Last n Months

datediff(m, dbo.Current_Facts.sales_date,getdate()) <=
@Prompt('Months Ago','N',,mono,free) and dbo.Current_Facts.sales_date<=getdate()

Last n Months Ago Ending When

datediff(m, dbo.Current_Facts.sales_date,@Prompt('Ending Date','D',,mono,free))
<= @Prompt('Months Ago','N',,mono,free) and
dbo.Current_Facts.sales_date<=@Prompt('Ending Date','D',,mono,free)
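Incidentally, the "get all results for the last two years" idea mentioned above can itself be written as a filter in the same style (a sketch):

```sql
-- Current and previous calendar year in one pass; time-specific measures
-- can then split the rows between YTD and PYTD within a single query.
datepart(yyyy, dbo.Current_Facts.sales_date) >= datepart(yyyy, getdate()) - 1
and dbo.Current_Facts.sales_date <= getdate()
```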

I will post a copy of my everything-you-never-thought-you-could-do-in-a-universe universe at the end of this series, but here is a list of some of the filters that exist in my universe.

Time-based filters Graphic

Common Time-based Filters

That’s a long list, eh?  I almost wish all these filters were already in the universe by default, but they aren’t, so don’t be afraid to add as many as you need.  Remember, when you add a filter to a query it affects the entire query.  Tomorrow we’ll talk about the technique of creating time specific measures…

«Good BI»

Sets Integrated into the Semantic Layer

August 18th, 2009

On August 5, 2009, the SAP BusinessObjects Innovation Center announced a prototype for moving Sets Technology directly into the Semantic Layer.

Potential Future of Sets

Since the EOL announcement of BusinessObjects Performance Manager and two of the three analytic engines, Process Analysis and Predictive Analysis, there has been an ongoing question about the future of Set Analysis.  I have long been an advocate of the incredible power and importance of sets to help customers get a better understanding of how data is changing under the covers.  If you have 100 more customers, that's great… but how many did you lose and how many did you gain?  You may have lost 50 and actually added 150.  What about my "Gold" customers?  How many new Gold customers do I have?  Where did they go?  Did they become Silver members?

We have many customers who have successfully implemented Set Analysis; however, the solution had not had significant R&D investment for some time and was tied to the legacy metrics engine of Performance Manager.  How could this technology be integrated into the new XI 3.x platform?

Sets and the Semantic Layer – I’m in Heaven

The Innovation Center has delivered a prototype which shows Sets integrated directly into the Semantic Layer.  The prototype is a plug-in to the universe that allows you to create these "sets", static or dynamic, all in the query panel.  Semantic layer sets let customers handle the complexity of sets natively within that environment.  This is a huge step forward.

Check out the 15-minute video on the SDN website.

You can see that this implementation can support all the classic calendar sets, with joiners, leavers, and stayers, which are at the core of Set Analysis.  In this lab implementation there is support for both Visual Data and Visual Sets; however, I hope this is expanded in the future to support Freehand SQL and imported lists.

I think this is a fantastic new innovation and needs to be added to the core product ASAP.  If you think so too, let your voice be heard:  http://sets.uservoice.com

I’ll keep you up-to-date with additional innovations around sets and may provide some additional posts on this topic if there’s interest.  Let me know.

«Good BI»