Posts Tagged ‘Semantic Layer’

SAP Discusses Latest BI Roadmap

August 15th, 2013

#AllAccessAnalytics

Yesterday, many SAP BusinessObjects customers, mentors, partners and even a few competitors listened in as Steve Lucas unveiled SAP’s bold, new analytics strategy.  The twittersphere was active as tweets were tracked using #allaccessanalytics; we had a lot of fun.  Zimkhita Buwa quipped:

Steve emphasized that SAP isn’t delivering a ‘mission accomplished’ banner.  It’s just a ‘mission’ banner.  “Where we are and where we are going.”  There is a bunch still to come…  One of the first bold things he announced was that the personal edition of SAP Lumira is free and you can download it now.  In addition, you can register for free access to the SAP Lumira Cloud here.

New Mindset

Steve’s passion for analytics came through loud and clear.  His passion goes way, way back… and is always welcome.  In fact, I think this is an old picture of Steve’s car back in 2003.

He talked about BusinessObjects founder Bernard Liautaud and the rich legacy from which the SAP BusinessObjects BI Suite comes.  Steve took a note from Vishal Sikka to say that SAP is innovating, will keep innovating, and that the Innovator’s Dilemma is crap.  Customers will continue to see great new innovations from SAP.

The new mindset from SAP delivers three pillars:

  • Enterprise Business Intelligence – For the Entire Organization
  • Agile Visualization – For the Business
  • Advanced Analytics – Data Science for Everyone

Enterprise Business Intelligence

With 60,000 customers, SAP continues to have the largest market share within the Business Intelligence space.  SAP BusinessObjects didn’t invent Business Intelligence, but our experience in the space is very rich… and if not SAP BusinessObjects, then who?  SAP is going to continue to build out capabilities for the enterprise organization.  This is squarely focused on the SAP BusinessObjects product line.

At the same time, the market has shifted: everyday business users want to connect to their data quickly and easily, get new insights, and share those insights with colleagues… perhaps without IT involved at all.  That brings us to agile visualization.

Agile Visualization

In SAP’s mind, Agile means it’s incredibly easy to adopt and deploy.  It should be light-weight.  Visualization means that it’s high quality and there is storytelling behind it.  The visualization tells a story.  It gives new insights.

So is this just Lumira?

No.  Steve was clear about this.  This is not just Lumira.  Lumira and Lumira Cloud are a part of the agile visualization strategy, but they are not the whole strategy.  He promised that more would be shared at TechEd, etc.  I think this is great news!  We are beginning to see integration today between Lumira Cloud and on-premise systems, and it sounds like this will continue.

Advanced Analytics

Steve said that this is more than just predictive.  As with Lumira, Predictive is a part, but there’s more.  SAP’s view is that this is not just for data scientists but data science for everyone.  It sounds like we might begin seeing more ‘smart’ functionality built into the analytics.  We’ve already seen predictive leveraging the interface of Lumira for easy data access.  I can definitely envision a lot of possibilities here.

Key SAP Executives

Michael Reh (@reh_michael) is leading development.  Passionate about analytics.

Christian Rodatus (@crodatus) is the go-to-market executive for analytics.  18 years at Teradata.  Brings a big data perspective.

Shekhar Iyer (@shekharlyer111) is leading the BI organization.  Brings a predictive perspective.

Jack Miller (@jackmillerIII) is Chief Customer Officer – in charge of generating successful, happy customers.

Jayne Landry (@jaynelandry) is the Crystal Kiwi, working closely with Shekhar.

Other Highlights

SAP BusinessObjects BI Suite 4.1 is out of ramp-up.  All the KPIs were hit, so I would expect a GA release soon with the release of SP1.

There was a great video of a WebIntelligence-like product that was running on top of HANA.  It was written completely in HTML5.  I’ve never seen this before.

Screenshot of HTML5 WebI-like Tool

There was another nice video of Lumira doing geospatial analysis using the new Lumira Visualization Extensions which were released with SP11.  Timo Elliott recently did a nice blog post talking about this topic.

Lumira with Geospatial Capabilities

Lumira with Geospatial Plug-in

On September 9th, SAP is planning to launch a new BI online support site.  It looks as if they are following in the footsteps of the HANA launch site.  They briefly showed a mock-up of what it might look like.

Magic Bus

Steve revealed the new bus.  Yes, it’s a literal Big Data Bus.  SAP will be rolling out a mobile briefing center that will be used to showcase SAP’s latest and greatest.  I think it’s one of SAP’s ways of saying there are 50 ways to leave your… niche BI tools.  So hop on the bus, Gus!

I couldn’t resist.

My Thoughts…

If you are an SAP customer and haven’t yet purchased SAP BusinessObjects, there is no better time than now.  The integration between SAP and the BusinessObjects BI Suite is second to none.  Here is a list of just a few of the unique advantages you can leverage when reporting against SAP ERP and SAP BW with SAP BusinessObjects:

  • Support for SAP CTS+
  • Integration with Solution Manager
  • Support for RRI (Report to Report Interface)
  • Support for BICS (3x faster than legacy BAPI interface)
  • Best Heterogeneous data source support
  • Best Slice/dice performance within MS Excel
  • Embedded Analytics within SAP Business Suite EP6
  • Crystal Reports options for Advanced List Viewer (ALV)
  • Semantic layer support for InfoSets, ABAP functions and ABAP Queries
  • 100% In-memory support for all your SAP data

If your organization is committed to SAP Business Suite, then leveraging SAP BusinessObjects to provide reporting off those solutions is a no brainer.

Secondly, have a look at Predictive Analysis.  Although this product is relatively new, SAP has come a long way very quickly.  SAP has combined the core self-service Lumira (Visual Intelligence) product with the power of R to deliver world-class predictive analytics to the data analyst.  The interface is extremely easy to use and if you haven’t seen it, check out the post I did where I provided a product walk-through.  It may not necessarily replace SAS today, but it can deliver tremendous value by shortening the time it takes data analysts to build, model and run predictive algorithms.  Users are no longer wholly dependent on a small number of statisticians to provide predictive and statistical analysis.  Predictive Analysis is a game changer.

Thirdly, get familiar with SAP’s simplified licensing.  Back in the day, when BusinessObjects was just one product, licensing was easy.  Over the years, as the BusinessObjects BI portfolio grew, not everyone was ready to leverage new technologies such as WebIntelligence, Dashboards, Explorer, etc.  As a result, BusinessObjects allowed customers to buy products à la carte to keep the pricing competitive.  A lot has changed.  Today, Business Intelligence is ubiquitous.  Everyone needs it, and organizations should want to leverage the same solution for multiple types of users who have different analytic needs.  Back when the only product we had was Crystal Reports, I used to show how Crystal Reports provided enterprise reporting, ad hoc reporting and dashboards.  SAP’s approach was to simplify this licensing through bundles.  At the beginning of 2013, SAP offered BI Suite licensing which provided two important changes: concurrent user licensing and a powerful software bundle of nearly every product in the SAP BusinessObjects Business Intelligence Product Suite.

Conclusion

This #allaccessanalytics webinar didn’t answer all our questions, but one thing was clear: SAP is fully committed to an easy-to-adopt analytics product suite for all users that serves the enterprise through both on-premise and cloud.  They are committed to delivering solutions that compete head-to-head against the newcomers, deliver customer value, and are agile and easy to adopt and use.

If you want more information on the latest published roadmaps from SAP, go here.

Now… hop on the bus Gus!

«Good BI»

Dealing with Small Numbers in Explorer

January 2nd, 2013

Recently I’ve been spending a lot of time with Explorer and I’ve made a few interesting discoveries that I wanted to pass along.

I’m not sure why some of my BI colleagues haven’t been giving Explorer any love (see the DSLayer Explorer Gets No Love podcast), but I think it’s a phenomenal product… but as with any product, there are always little quirks to deal with.  Some people call them bugs… I call them quirks.  🙂

Truncation Frustration

I spent a good part of a day experiencing truncation frustration.

I was trying to determine how to display a capacitance value within Explorer.  Capacitance, as you may know, can be an extremely large or an extremely small number.  It took me multiple attempts to figure out how to get it to work, but I finally cracked it!

What Didn’t Work

Capacitance was stored in the database as a float, which worked great when I used it as a measure — but when I displayed it as a facet value, suddenly Explorer began to get it wrong.  Here are both the facet and the measure for capacitance displayed side by side.  It seemed only able to display up to three decimal places.

See truncated values on the left and the correct measure value on the right.

Here is how I initially had the capacitance defined as a float.

Normally the first thing to do with numbers is leverage the format value within the semantic layer.  I tried everything, but somehow, if the capacitance was stored as a number, Explorer would consistently drop the significant digits to the right of the decimal place.  Here is how I defined the custom format for the capacitance.  I tried both # and 0’s, but neither worked.

The IDT appears to show the number correctly when I preview it…

… and yet it still appeared incorrectly within Explorer.  What was going on?  At this point it’s pretty clear that this bug (er, quirk) should be fixed, but I needed to get it working ASAP.  I needed a workaround.

I knew I was going to have to convert the capacitance to a string.  The added benefit was that capacitance would now also be fully searchable as a string.

I tried multiple formulas to try and do the conversion to a string.

None of these worked successfully:

  • cast (Capacitors.Capacitance as varchar(18)) returns scientific notation – Yuck.
  • Str (Capacitors.Capacitance,10,8)
  • charindex([^0],reverse(Str (Capacitors.Capacitance,10,8)))
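
To see why the straight cast fails, here is a quick repro sketch (hypothetical, assuming SQL Server; the sample capacitance value is my own for illustration):

-- Hypothetical repro of the failed attempts on SQL Server
DECLARE @c FLOAT
SET @c = 0.00000047   -- sample capacitance value (an assumption for illustration)

SELECT CAST(@c AS VARCHAR(18)) AS casted,  -- '4.7E-07': scientific notation... yuck
       Str(@c, 10, 8)          AS fixed    -- '0.00000047': fixed point, but shorter
                                           -- values are padded with trailing zeros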

What Did Work

The problem now was that regardless of how many decimals of precision the capacitance had, there were always 8 trailing zeros and I desperately wanted to get rid of these, so finally I found the magic formula:

CASE WHEN PATINDEX('%[1-9]%', REVERSE(Str(Capacitors.Capacitance,10,8)))
        < PATINDEX('%.%', REVERSE(Str(Capacitors.Capacitance,10,8)))
     THEN LEFT(Str(Capacitors.Capacitance,10,8),
          LEN(Str(Capacitors.Capacitance,10,8)) - PATINDEX('%[1-9]%', REVERSE(Str(Capacitors.Capacitance,10,8))) + 1)
     ELSE LEFT(Str(Capacitors.Capacitance,10,8),
          LEN(Str(Capacitors.Capacitance,10,8)) - PATINDEX('%.%', REVERSE(Str(Capacitors.Capacitance,10,8))))
END
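
If you want to convince yourself the formula works, here is a minimal sanity check (hypothetical sample value, with the column reference swapped for a local variable):

-- Trailing zeros from Str() are trimmed; whole numbers also lose the decimal point
DECLARE @c FLOAT
SET @c = 0.047   -- Str(@c,10,8) yields '0.04700000'

SELECT CASE WHEN PATINDEX('%[1-9]%', REVERSE(Str(@c,10,8))) < PATINDEX('%.%', REVERSE(Str(@c,10,8)))
            THEN LEFT(Str(@c,10,8), LEN(Str(@c,10,8)) - PATINDEX('%[1-9]%', REVERSE(Str(@c,10,8))) + 1)
            ELSE LEFT(Str(@c,10,8), LEN(Str(@c,10,8)) - PATINDEX('%.%', REVERSE(Str(@c,10,8))))
       END AS trimmed   -- returns '0.047'; a value like 3.0 would return just '3'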

Special Thanks to SwePeso

After beating my head against a wall, the sense of achievement and satisfaction were extremely rewarding and reminded me of why I love my job so much.  I got what I wanted.

It was great.  The user can search for a 0027 capacitance value and will get the appropriate match (more on that next week).  You can also observe that the capacitance values show up in the correct sort order: when I display capacitance from smallest to largest, the values sort correctly.

Capacitance Graph displays properly

Conclusion

As Explorer continues to mature, it’s my hope that more and more of these quirks will be addressed by the product team and more of the product will work as expected — becoming a fully fledged, first-class citizen of the semantic layer.

I’d also like to see hyperlink support within Explorer.  I think this is long overdue.  Please help me vote YES, thumbs up for this feature >> http://bit.ly/SzSSo0

«Good BI»


Dynamic Data Provider within BusinessObjects v4.0

December 19th, 2012

As customers migrate from classic Business Views, one feature missing from the new Information Design Tool is a straightforward dynamic data connection option when defining connections.  In fact, it has already been mentioned on Ideas Place along with a great use case.

Dynamic Data Connections

Dynamic Data Connections allow a database connection to be determined by the answer to a user prompt.  This capability is not often used but is extremely powerful for organizations that have exactly the same database schema across multiple instances.

Normally within the universe you can use table mappings to automatically point the user to the correct database, but in some cases the user must be able to select the desired database on the fly.

Once I began to look into this issue, I realized that I could use the power of @prompt and table mapping to dynamically determine the name of the database to connect to.  This solution works extremely well.  The only restriction is that all of my databases must be reachable through a single Universe connection; i.e., I can’t have the same database schema mapped across disparate database platforms, e.g. Microsoft and Oracle.

My Databases

In my example I am using MS SQL Server 2008.

I set up two databases: DB1 and DB2.  Both databases contained a single table called Customers.  The report consumer needs to dynamically switch between DB1 and DB2.
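
If you want to follow along, a minimal setup script might look like this (the CustomerID column and sample rows are assumptions for illustration; the post only specifies the Customers table and a Region field):

-- Hypothetical test setup on SQL Server 2008
CREATE DATABASE DB1
GO
CREATE DATABASE DB2
GO
USE DB1
CREATE TABLE dbo.Customers (CustomerID INT PRIMARY KEY, Region VARCHAR(50))
INSERT INTO dbo.Customers VALUES (1, 'Georgia')
GO
USE DB2
CREATE TABLE dbo.Customers (CustomerID INT PRIMARY KEY, Region VARCHAR(50))
INSERT INTO dbo.Customers VALUES (2, 'Oregon')
GO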

Defining The Connection

The first thing you need to do is define a connection.  It’s very important to define the connection so that it points to the first of the two databases.  By default, I don’t want to require a table owner and table qualifier to be defined.  In my case, I set up a connection called DDC_Connection that points to my DB1 database.

The Data Foundation

This is where the magic happens, but we’ll get to that later.  We are going to add the Customers table to our Data Foundation.  This is where you will define all your standard table joins.  In our case, my data foundation is called DDC_Foundation and it contains one table, Customers.

You will want to use Show Column Values and Profile Column Values to test your data foundation.

The Universe

Next we will want to set up our Universe as if it were only connecting to a single database.  Use the Data Foundation we’ve already defined to finish building your Universe.  In my case the Universe is called DDC_Universe.

You will want to use the Show Values and Queries capability within IDT to test your Universe and make sure everything is working properly.  You may also wish to test it with WebIntelligence.

The Magic!

Now that you’ve tested the Universe and it’s working with your default database, you want to make it dynamic by replacing the table names with an @prompt statement.  This statement will be interpreted by the reporting engine and replaced with the returned string.  In my case:

@prompt('Select DB:','K',{'DB1','DB2'},mono,constrained)
will be replaced with DB1 or DB2.

NOTE: If you are unfamiliar with the options for @prompt, you may read about them in the IDT User Guide.

I replaced the Customers table name with the @prompt command and the necessary table qualifier, .dbo.Customers

Therefore, the resulting string you will paste into the table name will be something like:

@prompt('Select DB:','K',{'DB1','DB2'},mono,constrained).dbo.Customers
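
To make the mechanics concrete: once the user answers the prompt with DB1, the engine effectively substitutes the string and generates SQL along these lines (a hypothetical illustration; the exact column list and aliasing will vary):

-- What the reporting engine effectively runs after the user answers 'DB1'
SELECT Customers.Region
FROM   DB1.dbo.Customers Customers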

Save the Data Foundation.

The Results

Open the Universe and have a look.  Everywhere you saw the table name, it has now been replaced with the @prompt string.  Below is a screenshot of the resulting Region field.

If I now right-click and choose Show Values…

The @prompt is evaluated at run time and you will see a prompt.

In my case I selected DB1, so the region Georgia appeared in the list.

The only challenge with this magic is that suddenly my universe can seem unnecessarily complex and difficult to maintain.  Therefore, in order to keep the universe as simple as possible, we can use table mappings to apply the prompts at run time only for my non-universe-designer users.

Better Magic:  Using Table Mapping

If you use table mapping, then you leave your universe in the default state such that the data foundation does not contain an @prompt.

Next, select the IDT security editor from the top menu bar.

This will allow you to define a security profile and apply it to a set of users.  Typically you would want this rule to apply to all non-Universe Designer Users.

First select Users/Groups at the bottom of the screen so you can select the users you want to apply your Data Security profile to.

Select the Group of users you want to apply the table mapping to.  (I selected Everyone for testing.)   Next, select the universe you want to apply the table mapping to.  In this example I selected DDC_Universe.unx.

Next, you will click the icon to create a new data security profile.  I recommend you rename the profile to clearly indicate what it is used for, e.g. Dynamic DB Connection.

Click on the Tables tab and then Insert to create a new table replacement.

You will be replacing your table names with the @prompt/table name combination values.  The table mapping wants to put quotes around the replacement strings, therefore you must specify the Qualifier, Owner and Table separately.

Here is what it should look like after you have entered in the values.

http://trustedbi.com/images/blog/BO40/dynamic/ddc_replacementtable.png

The final step is to activate this newly created security profile for the group and universe that you’ve selected.  Simply check the box next to the security profile.

Experience The Magic

After everything has been set up, the WebIntelligence reporting engine will apply the security profile whenever a matching user runs a report using a secured universe (i.e., one with a security profile applied).

Here is what the user will see in WebIntelligence.  They will select the universe, choose the objects they want for the query and choose Run.

When the user selects Run Query, they will receive a prompt to select the database.  This is because a security profile is defined on this universe for this user, so the table mapping is applied.

After the user selects the database,  they will see the results based on the database they chose.

Extra Credit

In the example I shared here, I hard-coded the list of databases in the @prompt selection.  If you have a large number of databases, you can create a table in your own database which lists all of them.  In the case of one customer with hundreds of distinct databases, they created a column called DB Name and made it part of the universe for reporting purposes.  They then referred to it in the query as:

@prompt('Select Data:','K','DB Version\DB Name',mono,constrained).dbo.tablename
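
A minimal sketch of the driving table might look like this (the table name is hypothetical; this assumes ‘DB Version\DB Name’ is the class\object path exposing the DB Name column in the universe):

-- Hypothetical lookup table driving the list of databases
CREATE TABLE dbo.DB_List ([DB Name] VARCHAR(128) NOT NULL)

INSERT INTO dbo.DB_List VALUES ('DB1')
INSERT INTO dbo.DB_List VALUES ('DB2')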

Troubleshooting

If you use the wrong @prompt type parameter, namely 'A' instead of 'K', quotes will be returned surrounding the string; this breaks the syntax and you may see a message like this:

Database error: Incorrect syntax near (IES 10901) (WIS 10901)
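
Here is a hypothetical illustration of why 'A' breaks things: the substituted value arrives wrapped in quotes, leaving a string literal where T-SQL expects an identifier:

-- With type 'A' the engine substitutes a quoted value, which is invalid T-SQL
SELECT Customers.Region
FROM   'DB1'.dbo.Customers Customers   -- Incorrect syntax near 'DB1'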

There were some known issues with @prompt in earlier releases of BI 4.0; this particular one was fixed in 4.0 SP4 Patch 6.

The Semantic Layer Amazes

I’m sorry, but I have to say it.  The breadth and depth of the semantic layer never cease to amaze me.  This is a perfect example.  When Business Objects first came up with the idea of a semantic layer, they got it right… and most of the features like table mapping have been there for years!

They say that 80% of customers only use 20% of the functionality that exists in a product.  I’m sure glad the other 80% is there when I need it… and all the pieces fall into place.

«Good BI»


Date Conversion Made Easy…

October 15th, 2010

Have you ever been using SAP BusinessObjects WebIntelligence and wanted to turn a prompt from a date time into a date?

I do a lot of demos using Microsoft SQL Server and by default Microsoft SQL Server saves everything as a date & time.  So what if I don’t want the time?  Simply convert the datetime into a date within the semantic layer.

Ignoring the time 12:00:00

There are many solutions, including many which rely on some form of string parsing, but why make it more complicated than it has to be?  This is what I like to use:

CONVERT(DATETIME, CONVERT(INT, GETDATE()))

This is the perfect solution if all the data is stored as 1/1/2010 12:00:00, wherein the time element is 0.

Ignoring time and rounding down

If indeed there is a time and it’s important to round down, e.g. 1/1/2010 11:59pm should be rounded down to 1/1/2010, then in this case I use:

CONVERT(DATETIME, FLOOR(CONVERT(FLOAT, GETDATE())))

This is great when accessing call center data and I need to group calls around a specific day, but the time is still extremely relevant.
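
Here is a minimal side-by-side sketch of the two idioms (the test value is hypothetical):

-- The INT conversion rounds to the NEAREST day, so afternoon and evening times
-- round UP; the FLOOR version always rounds DOWN to midnight
DECLARE @d DATETIME
SET @d = '2010-01-01 23:15:00'

SELECT CONVERT(DATETIME, CONVERT(INT, @d))          AS nearest_day,  -- 2010-01-02 00:00:00
       CONVERT(DATETIME, FLOOR(CONVERT(FLOAT, @d))) AS rounded_down  -- 2010-01-01 00:00:00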

Anyone else have any commonly used tips or tricks within the semantic layer?  Post them in the comments below!

«Good BI»

Supercharge Your Universe with Time-Series Analysis – Part 3

January 18th, 2010

The Semantic Layer allows for powerful time-based analysis of any relational database.  The ability for administrators to create complex time-based measures and filters means that business users can access the data they need to answer any ad hoc questions.  We covered these topics in more detail in the previous two posts.

In the past, organizations took weeks to answer such questions or instead these questions went unanswered because the reports took too long to build.  That is no longer true.

Easiest of All

I use Quicken as part of tracking my personal finances.  Last Year, Last Quarter and Last Month are common date ranges that I like to use over and over; unfortunately, BusinessObjects doesn’t provide these out of the box… but with a little creativity, you can build it yourself.  It’s quite simple really, and when you see how simple it is, you will be shocked.

Recipe for Success

The Recipe for Simplicity

The first thing we need to create is a database view with calculated dates which will change automatically from one day to the next.  The view will contain three columns:

  1. Name of the Range
  2. Start Date
  3. End Date

With these three columns we can build any date range we wish.  Here is what the results look like when you query the view:

Results from Date Range

Note that each day, based on the associated view, these dates will change.

Here is an excerpt of the logic in SQL Server to create the view:

SELECT 'Last 7 Days' AS Date_Range, CONVERT(smalldatetime,{fn curDATE()})-6
AS Begin_Date, CONVERT(smalldatetime, { fn curDATE() }) AS End_Date
FROM dbo.syscolumns
UNION
SELECT 'Today' AS Date_Range, CONVERT(smalldatetime,{fn curDATE()})
AS Begin_Date, CONVERT(smalldatetime, { fn curDATE() }) AS End_Date
FROM dbo.syscolumns
UNION ... etc.

Download the Full SQL Statement HERE.
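
If you’d rather sketch the view yourself, the wrapper is simply a CREATE VIEW around the UNIONed SELECTs (a minimal sketch; the downloadable script above covers all the ranges, and selects its constants FROM dbo.syscolumns, which T-SQL doesn’t strictly require):

-- Minimal sketch of the view; each UNION branch defines one named range
CREATE VIEW dbo.Date_Ranges AS
SELECT 'Last 7 Days' AS Date_Range,
       CONVERT(smalldatetime, {fn curDATE()}) - 6 AS Begin_Date,
       CONVERT(smalldatetime, {fn curDATE()})     AS End_Date
UNION
SELECT 'Today',
       CONVERT(smalldatetime, {fn curDATE()}),
       CONVERT(smalldatetime, {fn curDATE()})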

Isn’t that AMAZING!?!  Now all you need to do is create a BETWEEN join from the date field on your fact table to the start and end dates within the Date Range table.  Here is the logic:
dbo.Current_Facts.sales_date between dbo.Date_Ranges.Begin_Date and dbo.Date_Ranges.End_Date

It should look something like this:

Date Range IS BETWEEN Fact Table
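
Once the join is in place, a query filtered on a range name effectively becomes something like this (a hypothetical example using the table and column names above):

-- Hypothetical query for the 'Last 7 Days' range
SELECT f.*
FROM   dbo.Current_Facts f
       INNER JOIN dbo.Date_Ranges r
            ON f.sales_date BETWEEN r.Begin_Date AND r.End_Date
WHERE  r.Date_Range = 'Last 7 Days'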

Now that I have this dynamic date range defined within my universe I can use it two ways:

  1. I can use it to simplify the traditional prompts that typically include a ‘start date’ and ‘end date’ field.  BusinessObjects provides optional parameters, so the report could allow the user to choose which type of prompt they wish to use.
  2. Alternatively, I can use these fields for scheduled reports.  Users can specify “Yesterday” in the prompt so that each day when the report runs, the prompt is automatically updated.

Additional Downloads

This series was inspired by some incredible work done by Richard Reynolds while he was working at SAP BusinessObjects.  He has an amazing way of taking basic principles around technology and applying them in ways that are so simple.  The idea of the Begin/End Date connected to the fact table was his genius.  Having the chance to work with Richard was a true highlight of my career at SAP BusinessObjects and I wish him the best in all his future ventures.

If you would like the entire Universe, which contains fabulous examples of what I’ve been showing during this three-part series, you can download it here:

Download the Foodmart Universe (Foodmart.unv)
Download the Foodmart Database (SQL Server 2005 BAK File or MDF/LDF Files)

Enjoy,

«Good BI»