Archive

Posts Tagged ‘BusinessObjects 4.0’

Lumira 1.29 – Data Blending Comes of Age

November 23rd, 2015 1 comment

Lumira recently introduced a new concept called Blending.  Blending allows data from multiple datasets or fact tables to be displayed across a common dimension.  This was a critical next step in Lumira’s evolution, because the job of an analyst often requires pulling together multiple result sets into a single visualization.  It’s great to be able to visualize actual vs. budget together in a single graph; however, you often want to be able to create a new calculation across the datasets, e.g. variance, variance percent, etc.  This was not possible before.

Blending Enhancements

On Friday I downloaded the latest release of Lumira — Lumira 1.29.  There is a lot here to see, and although it’s always great to see more features coming into the product, the enhancements to blending really make this release stand out and make this version of Lumira a must-have upgrade.

Lumira has always done a good job of allowing users to manipulate data, but the lack of a microcube and the inability to create custom calculations across multiple datasets were extremely limiting.  Until now the only options were to:

  • Perform data preparation in the microcube of WebIntelligence and access the microcube directly using the APOS Data Gateway
  • Hope you could leverage the Join_By_SQL option within the Universe so that only a single result set would be returned.

As you might imagine, this was quite limiting and I ran into this first hand during a recent customer evaluation.  Here was my scenario.

Supporting YoY Comparisons

I was asked to calculate the growth factor within a dataset that included data over time.  The growth factor was defined as

G = (Volume(cy) / Volume(cy−t))^(1/t) – 1

Where t is the time in years. For the purpose of this analysis, t is taken to equal 3, 5 or 10 years.  cy is the current year.  The example below will use the standard year breakdown.

Example:  The table below is used in the example calculations

2014 Volume  2013 Volume  2012 Volume  2011 Volume  2010 Volume
    125,452      124,118      120,506      119,987      126,623

In 2014, the 3 year growth would be:

G = (125,452 / 119,987)^(1/3) – 1 = 1.5 %

In 2013, the 3 year growth would be:

G = (124,118 / 126,623)^(1/3) – 1 = -0.66 %
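
The growth formula and the two worked examples above can be checked with a short Python sketch (a hypothetical helper, not anything Lumira provides):

```python
def growth(volume_cy, volume_prior, t):
    """Compound growth: (Volume(cy) / Volume(cy-t))^(1/t) - 1."""
    return (volume_cy / volume_prior) ** (1 / t) - 1

# 2014, 3-year growth: (125,452 / 119,987)^(1/3) - 1
print(round(growth(125_452, 119_987, 3) * 100, 2))   # 1.5 (%)
# 2013, 3-year growth: (124,118 / 126,623)^(1/3) - 1
print(round(growth(124_118, 126_623, 3) * 100, 2))   # -0.66 (%)
```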

At first glance this seemed so simple.  It wasn’t.

This is a straightforward request, but with Lumira it wasn’t so simple.  When processing a Lumira document, the formula engine processes one record at a time, so prior to blending, the only option was to create custom calculations that filter the volume for each year.  I would then have to use these custom calculations in yet another calculation for the growth.  Here is a sample.

2014 Volume Custom Calculation:   if {Year} = 2014 then {TransCount} else 0

2011 Volume Custom Calculation:   if {Year} = 2011 then {TransCount} else 0

2014 3yr Growth:  Power(({Transcount_2014} / {Transcount_2011}), 1 / 3) – 1

So for 5 years of 3-yr growth calculations I would have to create 15 formulas!  Yikes!

It was possible, but there were two big problems:

  1. Each year you would have to add more formulas because the calculations weren’t dynamic by year.
  2. You lost the ability to visualize the data in context.  In this case you would not be able to easily show these calculations in a YoY line chart.
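
The pre-blending workaround described above can be sketched in plain Python (the year values come from the example table; the per-year filter pattern mirrors the Lumira custom calculations, but the code itself is illustrative only):

```python
# (year, volume) rows from the example table
rows = [(2010, 126_623), (2011, 119_987), (2012, 120_506),
        (2013, 124_118), (2014, 125_452)]

# One static "custom calculation" per year, filtering the volume...
vol_2014 = sum(v for y, v in rows if y == 2014)
vol_2011 = sum(v for y, v in rows if y == 2011)

# ...and a third formula per target year for the growth itself
growth_2014_3yr = (vol_2014 / vol_2011) ** (1 / 3) - 1
print(round(growth_2014_3yr * 100, 2))  # 1.5
```

Every additional year needs its own copies of these formulas, which is exactly why the count balloons to 15.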

Could blending be the answer?  Possibly.  Here’s what I discovered.

Previous Blending Helped But…

Suddenly I had an idea.  What if I loaded the dataset in twice?  Once as-is for the current year, and again with the year shifted by 3 (“Year+3”).  This would allow me to join on year, because the value for the year 2011 would now say 2014.  I could then display the current year’s value next to the value from 3 years ago.  It worked!

Here is what I did.

I loaded the same dataset twice and renamed them to Current Year and CY-3.

Two imported datasets

In the dataset named CY-3, I created a calculated dimension called Year+3 in which I added “+3” to the value of the Year column… so now 2014 became 2017, 2013 became 2016, etc.

Year+3 Calculated Field

In the previous step, when I created Year+3 and added 3 to the Year, the field went from being defined as an integer to a real number.  Lumira doesn’t allow you to join across data types, and since my original Year column is still defined as an integer, I needed a current-year field that is also a real number.  I accomplished this by creating a new calculated dimension called Year+0 with the formula Year + 0.

Year+0 Calculated Field

Next I linked the two datasets.  I was able to link Year+3 from the CY-3 dataset to Year+0 from the CurrentYear dataset.

Dataset Linking

I also renamed the TransCount field in the CY-3 dataset to Transcount (CY-3) to avoid any confusion with the value TransCount in the CurrentYear dataset.

Next I created a column chart visualization showing the data from CurrentYear and the CY-3 datasets together in the same chart in the following manner:

First, add Year+0 and TransCount to the column chart.  This shows you the current values.  Next, change the dataset you are working with to CY-3 and add TransCount (CY-3) to the output.  Now I can see the CurrentYear TransCount compared to the TransCount from 3 years earlier.  As expected, there are null values for the earliest 3 years, because I loaded the same dataset twice and my initial calculation is using Year+3.
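
The load-twice-and-link trick can be simulated with a small pandas sketch (hypothetical data; Lumira performs the equivalent join internally once the datasets are linked):

```python
import pandas as pd

# One dataset, loaded "twice": CurrentYear as-is, CY-3 with the year shifted by +3
data = pd.DataFrame({
    "Year": [2004, 2005, 2006, 2007, 2008],
    "TransCount": [100, 110, 121, 125, 130],
})
current = data.rename(columns={"Year": "Year+0"})
cy3 = (data.assign(**{"Year+3": data["Year"] + 3})
           .rename(columns={"TransCount": "TransCount (CY-3)"})
           [["Year+3", "TransCount (CY-3)"]])

# Link Year+0 (CurrentYear) to Year+3 (CY-3); the earliest 3 years have no match
blended = current.merge(cy3, left_on="Year+0", right_on="Year+3", how="left")
print(blended[["Year+0", "TransCount", "TransCount (CY-3)"]])
```

As in the chart, 2004 through 2006 come back null, and the TransCount (CY-3) shown for 2007 is the TransCount originally recorded for 2004.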

Viewing YoY Volume (TransCount)

In other words, if I hover over the value for 2004 I will see it matches the Transcount(CY-3) value for 2007.

Next I want to Exclude the 2004 through 2006 numbers.

Excluding Years with null values

So now I am left with the correct bar chart.

Viewing YoY Volume (with nulls excluded)

The next step is to do some math between the two datasets.  Did I simply want the variance (the difference between CurrentYear and CY-3)?  Nope.  I needed to calculate the growth rate using the formula introduced above.  The problem was that in Lumira 1.28 there was no way to do this.  Now you can!

Custom Calculations Across Datasets

As soon as I got my hands on Lumira 1.29, I found the new Custom Calculation feature in two places.  You can either:

  • Use the menu at the top of the charting area
  • Right Click on the measure and select Add Calculation >> Custom Calculation
 
From the Menu From the Measure

So let’s create this new custom calculation.  Below you can see where I’ve highlighted a new and critical change.  You can now select the “Dataset” of the fields you want to use in the calculation.  This allows me to choose measure elements across both data sources:

Ability to select either dataset

So now I can create the following new growth calculation:

Power({DS1.TransCount}/{DS2.Transcount (CY-3)},1/3) – 1

Once I add this new YoY calculation to the chart and remove the previous measure values I can see the following results:
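
What the cross-dataset Power() formula computes can be sketched row by row in pandas (the numbers here are made up for illustration):

```python
import pandas as pd

# A blended result: DS1.TransCount next to DS2.TransCount (CY-3) per year
blended = pd.DataFrame({
    "Year": [2007, 2008, 2009],
    "TransCount": [125, 130, 140],
    "TransCount (CY-3)": [100, 110, 121],
})
# Power({DS1.TransCount}/{DS2.Transcount (CY-3)}, 1/3) - 1, evaluated for every year
blended["3yr Growth"] = (blended["TransCount"] / blended["TransCount (CY-3)"]) ** (1 / 3) - 1
print(blended["3yr Growth"].round(4).tolist())  # [0.0772, 0.0573, 0.0498]
```

Because the calculation references both linked datasets, it stays dynamic: a new year of data yields a new growth value with no extra formulas.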

Dynamic 3yr Growth YoY

Now I have a dynamic growth calculation and I didn’t have to create dozens of formulas!  It works beautifully.

Now the only remaining limitation is that I cannot show the 3-yr growth and the 5-yr growth in the same chart, because each of them is linked through a different combination of results (Year linked to 3Year, and Year linked to 5Year).  There is currently no way to link more than two datasets together.

Exercise for You

If you would like to try this exercise out for yourself, I have attached the associated data file here.  All you will need to do is load the dataset twice and then create the calculations as described above.

Now I’d like you to calculate the growth of Volume using the Growth Formula stated above, but this time instead of doing it for 3 yrs, do it for 5 yrs.  Once you’re done, you should get a chart like this:

Dynamic 5yr Growth YoY

Wrap-up

Lumira 1.29 is a huge leap forward.  The ability to do data blending and create new dynamic calculations across multiple datasets is a very important capability indeed.  This goes a long way toward removing many of the limitations you may have encountered around data preparation and analysis thus far.

So the next question is… when will they be releasing Lumira 2.0? 🙂

Enjoy!

«Good BI»

SAP Discusses Latest BI Roadmap

August 15th, 2013 3 comments

#AllAccessAnalytics

Yesterday, many SAP BusinessObjects customers, mentors, partners and even a few competitors listened in as Steve Lucas unveiled SAP’s bold, new analytics strategy.  The twittersphere was active as tweets were tracked using #allaccessanalytics, and we had a lot of fun.  Zimkhita Buwa quipped:

Steve emphasized that SAP isn’t delivering a ‘mission accomplished’ banner.  It’s just a ‘mission’ banner.  “Where we are and where we are going.”  There is a bunch still to come…  One of the first bold things he did announce was that the personal edition of SAP Lumira is free and you can download it now.  In addition, you can also register for free access to the SAP Lumira Cloud here.

New Mindset

Steve’s passion for analytics was clear to hear.  That passion goes way, way back… and is always welcome.  In fact, I think this is an old picture of Steve’s car back in 2003.

He talked about BusinessObjects founder Bernard Liautaud and the rich legacy from which the SAP BusinessObjects BI Suite comes.  Steve took a note from Vishal Sikka to say that SAP is innovating and will keep innovating, and that the Innovator’s Dilemma is crap.  Customers will continue to see great new innovations from SAP.

The new mindset from SAP delivers three pillars:

  • Enterprise Business Intelligence – For the Entire Organization
  • Agile Visualization – For the Business
  • Advanced Analytics – Data Science for Everyone

Enterprise Business Intelligence

With 60,000 customers, SAP continues to have the largest market share within the Business Intelligence space.  SAP BusinessObjects didn’t invent Business Intelligence, but our experience in the space is very rich… and if not SAP BusinessObjects, then who?  SAP is going to continue to build out capabilities for the Enterprise organization.  This is squarely focused on our SAP BusinessObjects product line.

At the same time the market has shifted and everyday business users want to be able to connect to their data quickly and easily and get new insights and share those insights with colleagues… and perhaps not have IT involved at all.  That brings us to agile visualization.

Agile Visualization

In SAP’s mind, Agile means it’s incredibly easy to adopt and deploy.  It should be light-weight.  Visualization means that it’s high quality and there is storytelling behind it.  The visualization tells a story.  It gives new insights.

So is this just Lumira?

No.  Steve was clear about this.  This is not just Lumira.  Lumira and Lumira Cloud are a part of the agile visualization strategy, but they are not the whole of it.  He promised that more would be shared at TechEd, etc.  I think this is great news!  We are beginning to see integration today between Lumira Cloud and on-premise systems, and it sounds like this will continue.

Advanced Analytics

Steve said that this is more than just predictive.  As with Lumira, Predictive is a part, but there’s more.  SAP’s view is that this is not just for data scientists but data science for everyone.  It sounds like we might begin seeing more ‘smart’ functionality built into the analytics.  We’ve already seen predictive leveraging the interface of Lumira for easy data access.  I can definitely envision a lot of possibilities here.

Key SAP Executives

Michael Reh (@reh_michael) – is leading development.  Passionate about analytics.

Christian Rodatus (@crodatus)  – Go to market executive for analytics.  18 years at Teradata.  Brings big data perspective.

Shekhar Iyer (@shekharlyer111) is leading the BI organization.  Brings a predictive perspective.

Jack Miller (@jackmillerIII) is Chief Customer Officer – in charge of generating successful, happy customers.

Jayne Landry (@jaynelandry) is Crystal Kiwi, working closely with Shekhar.

Other Highlights

SAP BusinessObjects BI Suite 4.1 is out of ramp-up.  All the KPIs were hit, so I would expect a GA release soon with the release of SP1.

There was a great video of a WebIntelligence-like product that was running on top of HANA.  It was written completely in HTML5.  I’ve never seen this before.

Screenshot of HTML5 WebI-like Tool

There was another nice video of Lumira doing geospatial analysis using the new Lumira Visualization Extensions which were released with SP11.  Timo Elliott recently did a nice blog post talking about this topic.

Lumira with Geospatial Capabilities

Lumira with Geospatial Plug-in

On September 9th, SAP is planning to launch a new BI online support site.  It looks as if they are following in the footsteps of the HANA launch site.  They briefly showed a mock-up of what it might look like.

Magic Bus

Steve revealed the new bus.  Yes, it’s a literal Big Data Bus.  SAP will be rolling out a mobile briefing center that will be used to showcase SAP’s latest and greatest. I think it’s one of SAP’s ways of saying there’s  50 ways to leave your… niche BI tools.   So hop on the bus Gus!

I couldn’t resist.

My Thoughts…

If you are an SAP customer and haven’t yet purchased SAP BusinessObjects, there is no better time than now.  The integration between SAP and the BusinessObjects BI Suite is second to none.  Here is a list of just a few of the unique advantages you can leverage when reporting against SAP ERP and SAP BW with SAP BusinessObjects:

  • Support for SAP CTS+
  • Integration with Solution Manager
  • Support for RRI (Report to Report Interface)
  • Support for BICS (3x faster than legacy BAPI interface)
  • Best Heterogeneous data source support
  • Best Slice/dice performance within MS Excel
  • Embedded Analytics within SAP Business Suite EP6
  • Crystal Reports options for Advanced List Viewer (ALV)
  • Semantic layer support for Infosets, ABAP functions and ABAP Queries
  • 100% In-memory support for all your SAP data

If your organization is committed to SAP Business Suite, then leveraging SAP BusinessObjects to provide reporting off those solutions is a no brainer.

Secondly, have a look at Predictive Analysis.  Although this product is relatively new, SAP has come a long way very quickly.  SAP has combined the core self-service Lumira (Visual Intelligence) product together with the power of R to deliver world class predictive analytics to the data analyst.  The interface is extremely easy to use and if you haven’t seen it, check out the post I did where I provided a product walk-through.  It may not necessarily replace SAS today, but it can deliver tremendous value by shortening the length of time it takes data analysts to build, model and run predictive algorithms.  Users are no longer wholly dependent on the small number of statisticians to provide predictive  and statistical analysis.  Predictive Analysis is a game changer.

Thirdly, get familiar with SAP’s simplified licensing.  Back in the day, when BusinessObjects was just one product, licensing was easy.  Over the years, as the BusinessObjects BI portfolio grew, not everyone was ready to leverage the new technologies such as WebIntelligence, Dashboards, Explorer, etc.  As a result, BusinessObjects allowed customers to buy products à la carte to keep the pricing competitive.  A lot has changed.  Today, Business Intelligence is ubiquitous.  Everyone needs it, and organizations should want to leverage the same solution for multiple types of users who have different analytic needs.  Back when the only product was Crystal Reports, I used to show how Crystal Reports provided enterprise reporting, ad hoc reporting and dashboards.  SAP’s approach was to simplify this licensing through bundles.  At the beginning of 2013, SAP offered BI Suite licensing which provided two important changes: concurrent user licensing and a powerful software bundle of nearly every product in the SAP BusinessObjects Business Intelligence Product Suite.

Conclusion

This #allaccesswebinar didn’t answer all our questions but one thing was clear:  SAP is fully committed to an easy-to-adopt analytics product suite for all users that serves the enterprise through both on-premise and cloud.  They are committed to delivering solutions that: compete head-to-head against the newcomers, deliver customer value and are agile and easy to adopt and use.

If you want more information on the latest published roadmaps from SAP, go here.

Now… hop on the bus Gus!

«Good BI»

Migrating BEx-based Information Spaces to BusinessObjects 4.0

August 8th, 2013 3 comments

Sometimes in software it can feel that you take two steps forward and one step back.

One example of this is with BEx-based Explorer Information Spaces when migrating from XI 3.1 to BO 4.0. In 4.0, SAP removed the ability to create information spaces against the legacy universe (unv) and instead required everyone to use the new universe format (unx).

On the surface this lack of support for the legacy universe didn’t seem to really be a problem because BusinessObjects 4.0 provides a painless way to migrate universes from the legacy unv to unx format.

What I didn’t realize until a few months ago was that this WAS a big problem.  A big problem for customers who were creating Information Spaces based on BEx queries and Infocubes.

I’d like to share with you my journey to understand what is really required to move my BEx-based Information Spaces to BusinessObjects v4.0.

Explorer, BEx and Unx

Explorer is NOT an OLAP-aware product and therefore doesn’t understand hierarchies, so in XI 3.1 the unv would flatten the hierarchies for you, generating a flattened hierarchy as L00, L01, L02, L03, etc.  There are some advantages to this approach, but there are obvious drawbacks as well.

With BusinessObjects v4.0, SAP rolled out the idea of a transient universe, such that if you wanted to create a WebIntelligence report you didn’t have to create a universe first. WebIntelligence would create a universe on the fly and a hierarchy is treated as a single column with the ability to use +/- to expand and collapse. (You can read more about WebIntelligence and BICS connectivity here.)

If you try to convert an XI 3.1 universe based on a BEx query to a unx, it gives you the following error:

Now What?

The only 2 options I came up with to overcome this challenge were:

  • Use WebIntelligence (unv) to generate an Excel file and have Explorer index the xls file.
  • Leverage a multi-source relational connection to connect to the BEx cube and hierarchy relationally

Approach 1 – Excel Output

The approach I took here was to use the legacy unv file to create a WebI report.  I would then schedule this WebI report to generate an Excel file.  The Excel file would overwrite the ‘source’ Excel file of the Information Space.

To set this up, I created the WebI report with no header and a table that starts at position (0,0).  This table will easily export to Excel.

Sample WebI output prepped for Excel

Next, export the results to Excel 2007 format (xlsx)

Resulting WebI output in Excel

I then uploaded the xlsx file to the BI Launchpad and used it as a source for my Information Space.

 

Once Explorer was able to generate the information space the way I wanted it, I was ready to schedule the WebIntelligence report to Excel output.

Information Space based off an Excel file

Now, look at the properties of the Excel file, because we will need this when scheduling our WebIntelligence report.

Find the FRS location of the Excel File

I can now schedule the WebIntelligence report to run on a recurring basis, since I know the physical location of the Excel file in the FRS.

In my case the directory is: D:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\FileStore\Input\a_053\035\001\74549\   and the file name is:  work order excel-guid[e93a2669-0847-4cd2-b87d-390beb1c956c1].xlsx.

I can use that file name in the WebIntelligence scheduler and write over the source xlsx file for my information space.

When scheduling the file, make sure to select the following:

  • Recurrence – choose the frequency that makes sense for you.
  • Formats – choose Excel.   (The default is always WebI.)
  • Destinations – choose File System.  Do NOT keep an instance in the history.  Use the xlsx directory and filename.

Select the correct parameters in the recurring schedule

As a last step, I would then schedule Explorer to reindex the Information Space after the file was changed and all will be good.

Scheduling Indexing for Explorer

Now the information space should be generated and everything should work just fine.

I must say that at first I really struggled to get this technique to work.  I’m not sure if my WebI output file was too large or if there was some other type of problem.  I spent hours (and hours and hours) trying to troubleshoot why this wasn’t working.  After speaking with technical support and having them attempt the same process, we thought that there was some type of incompatibility between the Excel files generated by WebIntelligence and the Excel format required by the Information Space for indexing.

There were a couple of different errors I encountered.  A few times I forgot to specify Excel as the output type so I got this error:

The only way I was able to fix the error was to restart my Explorer services and reschedule the WebI report and make sure it was exporting in Excel format.

Another error I got was when I didn’t properly exit the information space before attempting to reindex it.  In that case I would get an indexing error and see this when I opened up the information space through “Manage Spaces”.

I found that this method of solving the issue was somewhat flaky.  Sometimes I found that after WebIntelligence overwrote the Excel file, Explorer was no longer able to index it.  It was very frustrating, and technical support was only able to provide limited help because this is not the recommended solution for the problem.

So what did SAP recommend?  They suggested a much less elegant but more robust and fully supported approach — a multi-source universe.

Approach 2 – Multi-source Solution

This solution is less straightforward, but I was able to get it working and SAP says that this is the only solution that’s officially supported.

There are three things we need to do:

  1. Generate the flattened hierarchy lists and load them into another database (e.g. SQL Server)
  2. Read the SAP Note 1656905 about creating a unx universe from a BW Infocube
  3. Link the two systems via a multi-source connection

In order to generate the flattened hierarchy, you must use the legacy unv universe against your Infocube.  The ability to flatten a hierarchy is not available in a unx universe.  (SAP says that BICS is not there to flatten the hierarchy and there are no plans to enable it because then it’s no longer a hierarchy.  Bummer.)

Next, create a WebIntelligence report based on a legacy unv universe.  Add all levels of the hierarchy to the report and run the report.  Once you have the report, export the results to an Excel file and load them into a relational database.

I created a table called: tblPCHier

Flattened BW Hierarchy in SQL Server

Next, I imported the Excel output into my new database table:

BW Hierarchy is loaded in SQL Server

Note:  You need to watch out for accidental duplicate names at lower levels of the hierarchy.  Because WebIntelligence will automatically try to aggregate multiple values, be aware that if child nodes have the same name but a different parent value, the child nodes will roll up and display an aggregated value within Explorer.  If this is not what you want, make sure that the child node names are unique.
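
The duplicate-name pitfall is easy to demonstrate with a pandas sketch (hypothetical hierarchy rows, standing in for the roll-up behavior described above):

```python
import pandas as pd

hier = pd.DataFrame({
    "L01": ["East", "West"],
    "L02": ["Sales", "Sales"],   # same child name under two different parents
    "Amount": [100, 200],
})
# Aggregating on the child level alone rolls both nodes into one value
print(hier.groupby("L02")["Amount"].sum())           # Sales -> 300
# Making child names unique keeps the nodes apart
hier["L02_unique"] = hier["L01"] + " / " + hier["L02"]
print(hier.groupby("L02_unique")["Amount"].sum())    # East / Sales -> 100, West / Sales -> 200
```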

Next we are ready to go into the IDT (Information Design Tool) and create our multi-source universe.  Follow the instructions listed in the SAP Note 1656905 to understand how to create a unx on top of a BW Infocube.

Once our BW star schema has been added to our data foundation, we can add another connection to our other datasource, the relational database, so we can bring in our hierarchy.

Lastly, join the column from the BEx fact table (SAP) to the key of my hierarchy table (SQL Server).

When our multi-source universe is complete we should see a connection to SAP, a connection to a relational database, a data foundation and a universe.

Completed unx components

Here is a preview of my hierarchy table from within the IDT:

View of flattened Hierarchy

So now we just need to verify that everything we need is in the universe.  The big challenge being that not everything from BEx is available in a unx.  Here are some of the things we lose when we go to a relational universe:

  • Calculated Key Figures
  • Restricted Key Figures
  • Variables
  • Conditions
  • Unit / Currency Conversion
  • Hierarchies (which we know about)
  • Custom Structures
  • Time Dependent Objects

I suggest you commit this list to memory.

In one case I had over 50 calculated key figures that I needed to map into Explorer, and recreating the logic in SQL was difficult and tedious.

In that case I had measures that included some time dependent calculations:

  • Total AR
  • Current AR, Over 30 days, Over 60 days, Over 90 days, Over 120 days
  • Current Debit, Debit over 30, Debit over 60,  Debit over 90,  Debit over 120
  • Current Credit, Credit over 30, Credit over 60 and Credit over 120

In BEx, I had implemented exit variables to do dynamic time calculations.  Now I need to do the same thing for Explorer.

To accomplish this, I built SQL Server views which dynamically calculated values such as Last Day Previous Month and Last Day Previous Month Previous Year.  I could then use these dynamic calculations in my universe.

Equivalent userexit logic in SQL Server

Although I included these views in the data model, I did not need to join them to anything.

These views were simply used to dynamically generate a date value which was used to restrict the queries to the correct data elements.
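
The date boundaries those views produce can be sketched with Python’s standard library (the actual implementation was T-SQL views; the function names here are mine):

```python
import calendar
from datetime import date, timedelta

def last_day_prev_month(today: date) -> date:
    """Last day of the month before `today` (e.g. bounds 'Total AR, Last Month')."""
    return today.replace(day=1) - timedelta(days=1)

def last_day_prev_month_prev_year(today: date) -> date:
    """Last day of that same month, one year earlier."""
    d = last_day_prev_month(today)
    return date(d.year - 1, d.month, calendar.monthrange(d.year - 1, d.month)[1])

print(last_day_prev_month(date(2013, 8, 8)))            # 2013-07-31
print(last_day_prev_month_prev_year(date(2013, 8, 8)))  # 2012-07-31
```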

Here is a look at the measures that were created in the universe (click on the image to enlarge):

Measures within the Universe

Here is a screenshot of the WHERE logic for “Total AR, Last Month”:

WHERE Logic for restricted key figure

Here is some of the logic spelled out.

WHERE logic for “Total AR” that leverages curDate()

@Select (ZFIAR_C01\Clearing/Reverse/Posting Dates\Posting date in the document) <= curDate()
OR
( @Select (ZFIAR_C01\Clearing/Reverse/Posting Dates\Clearing date) > curDate()
OR
@Select (ZFIAR_C01\Clearing/Reverse/Posting Dates\Clearing date) < {d ‘1900-01-01’}
)

WHERE logic for “Total AR, Last Month” that leverages the Last Day Prev Month view

@Select (ZFIAR_C01\Clearing/Reverse/Posting Dates\Posting date in the document) <= @Select(SQL\V Last Day Prev Month\Last Day Prev Month)
OR
( @Select (ZFIAR_C01\Clearing/Reverse/Posting Dates\Clearing date) <= @Select(SQL\V Last Day Prev Month\Last Day Prev Month)
OR
@Select (ZFIAR_C01\Clearing/Reverse/Posting Dates\Clearing date) < {d ‘1900-01-01’}
)

If you have to do a lot of this type of time-based calculation logic, you might also want to review some of my previous blogs on the topic.  You don’t necessarily have to create views in the database to do the time calculations:
http://trustedbi.com/2009/12/29/timeseries1/
http://trustedbi.com/2010/01/06/timeseries2/
http://trustedbi.com/2010/01/18/timeseries3/

Caveat

This method of implementation is not for the faint-hearted.  It can potentially mean a lot of work.

I would like to highlight some important considerations:

  • If your hierarchies change on a regular basis, you will need to automate the updating of the SQL Server table which contains the hierarchy.
  • If you have a lot of calculated key figures, they will need to be recreated within SQL.
  • Any logic you built into variables or user exits may need to be recreated within SQL.

Given these constraints it’s hard for me to endorse converting all your Explorer information spaces to BusinessObjects v4.0 without first understanding the complexity of your Infocubes. The good news is that SAP BusinessObjects v4.1 will overcome these limitations.

Try the Excel method first.  It’s worth a shot.

BusinessObjects v4.1 to the Rescue

Recently, in a What’s New in Explorer 4.1 ASUG call, SAP announced that in BusinessObjects 4.1, unv files will be supported.  This is great news.  Of course, that begs the question: if unx is the way forward, how will we flatten our SAP hierarchies in the future?

SAP BusinessObjects 4.1 is currently in ramp-up and the latest information on Service Marketplace says that it is targeted to exit ramp-up on November 10, 2013.  As always, this date is subject to change.

One additional piece of advice: if you are curious to learn about future releases and maintenance schedules, I suggest you check out this site on Service Marketplace: https://websmp108.sap-ag.de/bosap-maintenance-schedule  Although these are only planned dates, they are useful to have when planning system maintenance and upgrades.

Hope you’ve found this helpful.

«Good BI»


Virtualizing SAP BusinessObjects BI 4

June 18th, 2013 1 comment

Last year I wrote a quick article about Virtualization support for BusinessObjects.

Since that time, SAP has been doing a lot of testing and refining its guidelines for virtualizing your BI 4 environment.

Originally advertised to only a handful of attendees at Sapphire, the www.sap.com/bivirtualization link has had over 1,000 views per month.

This is the official guidance you should point all customers, partners and employees to when it comes to BI 4 virtualization.  This document can be used by your BI team to negotiate the right-sized infrastructure from your IT team for a large-scale move to BI 4.  Don’t get caught short.

Ashish Morzaria has done a great job of putting everything together in one place.  He’s collected feedback from actual customers together with performance tests that have been run internally at SAP.  All this information is put together in a 42-page everything-you-need-to-know whitepaper on VMware ESXi 5:
http://www.sap.com/bivirtualization

Check it out!

«Good BI»

Installing BusinessObjects v4.0 – CMS Database

January 5th, 2012 No comments

I’ve installed BusinessObjects about a hundred times and there is very little that’s changed about the installation wizard from a user interface perspective since Crystal Enterprise 10.  BusinessObjects has always included “in the box” all the components necessary to successfully install BusinessObjects for a single server configuration.

That said, there is ONE change I make every time I do an installation.

History of the Embedded Database

On Windows, it’s gone from SQL Server Embedded (CE10) to MySQL (for support of Unix and Linux) and back to SQL Server.  Now that SAP has acquired its own database technology, don’t be too surprised if it comes bundled with Sybase in the future.

Personally, I’ve never liked using the embedded database and I wouldn’t recommend you use it either.  In fact, I recently had a situation with a client who, due to an overly restrictive server/firewall configuration, was unable to get the embedded database working, and we wasted hours trying to troubleshoot the problem.

Installation Best Practice

I always choose “Custom Install” so that I can:

  • Modify the installation location
  • Uncheck the default embedded database (for the CMS)

I really don’t like to include the embedded database because I want to give BusinessObjects as much on-server resources as possible – especially with v4.0.

Always create space in an existing database environment to support BusinessObjects.  There are many supported CMS databases including:  SQL Server, MySQL, IBM DB2, Oracle, MaxDB and Sybase.

NOTE:  Always test connectivity to the database from the server on
which you will be installing BusinessObjects to make sure the connectivity
is working.

During the installation you will want to NOT include the embedded database.  That means doing a CUSTOM install and deselecting Integrated Database.

No Embedded Database

De-Select the Integrated Database

What I love about the installer is that it will check the database connectivity before the installation begins.  If there is an issue with the database client configuration, permissions, etc., the installation will warn me of the situation and not continue.  This gives me the confidence to know that assuming I have enough hard drive space, when I select “Begin Installation”,  it will complete successfully.

«Good BI»