
IBM Blueview

Cognos Analytics and all things IBM


Data Modeling for Success: BACon 2020

October 15, 2020 by Ryan Dolley

Getting Started

Logging in and creating a folder for your work

  1. Because of high interest in today’s session we are using a single 96 core 768 GB RAM Cognos server with anonymous access allowed. This makes logistics much easier but means we don’t have access to the ‘My content’ feature of Cognos.
  2. Click here to access Cognos: http://3.216.29.72:9300/
  3. Click ‘Team content’ and navigate to the ‘BACon Users’ folder.
  4. Click the ‘+’ button in the navigation window and select ‘Folder’.
  5. Give your folder a name with the following format: Initials – favorite movie – favorite color. In my case it would be ‘RPD-Solyaris-Green.’ Hopefully this ensures a unique folder for everyone.
  6. We are now ready to begin today’s class.

Organizing for Self Service with Data Sets

In our first example we will prepare a data set for self-service. This data set is sourced from an existing Framework Manager package and is a good example of a technique PMsquare often uses to simplify presentation and improve performance for end users. We will cover the basics of creating a data set, optimizing its performance and using it in a dashboard.

Creating a data set off a relational package

  1. Click ‘Team content’ on the left side of the screen and navigate to Team Content>BACon Modules.
  2. Hover over the ‘Go data warehouse (query)’ package and click the ‘More’ button.
  3. Select ‘Create data set’ from the ‘More’ menu. The data set screen will launch.
  4. The data set screen might look familiar – it is a stripped down version of Cognos Analytics report authoring. You should see a ‘Source’ section with the ‘Go data warehouse (query)’ package loaded, and an area that reads ‘Add data here’.
  5. Expand the ‘Sales Target (query)’ folder under ‘Insertable Objects’.
  6. Expand the ‘Sales Target (query)’ namespace located within.
  7. From the namespace, drag the following fields to the ‘Add data here’ section of the screen. A list will appear and populate with data from the package:
  8. Retailer.Region
  9. Retailer.Retailer country
  10. Retailer.Retailer name
  11. Retailer.Retailer code
  12. Employee by region.Employee name
  13. Employee by region.Position name
  14. Time.year
  15. Time.Month key
  16. Time.month
  17. Time.month (numeric)
  18. Sales target fact.Sales target
  19. Click the ‘Page design’ drop down and select ‘Page preview’.
  20. Turn off  ‘Summarize detailed values…’ and observe the effect of summarization on the data displayed in the list. The underlying data is at the level of individual Target transactions.
  21. Turn detail summarization back on.
  22. Click the ‘Year’ column and select ‘edit layout sorting’ from the sorting drop down menu in the toolbar.
  23. Drag ‘Year’ and ‘Month (numeric)’ to the ‘Detail Sort List’ and click OK. We have sorted our data set by year and month. Sorting on commonly used columns can improve performance.
  24. Click ‘Position name’ and select ‘Create custom filter’ from the filter drop down menu in the toolbar.
  25. Click ‘Condition’ to access the conditional filter screen
  26. In the ‘Input condition’ next to ‘Contains’, type ‘Sales’. This will filter the Position name column for values that contain the word ‘Sales’. Removing unnecessary rows or columns can improve performance.
  27. Click the ‘Save’ icon and select ‘Save as’.
  28. In the ‘Save as’ window, navigate to ‘Team Content > BACon users > your folder’. This is the folder you created in step 5 of Getting Started.
  29. In the ‘Save as:’ text box, type ‘Sales Target Data Set’.
  30. Click ‘Save’. The ‘Save as’ window will close.

Finding and loading a data set

  1. Click the ‘Home’ icon to return to the Cognos Analytics welcome screen.
  2. Click ‘Team content’ on the left side of the screen and navigate to ‘Team Content>BACon users>your folder’.
  3. Hover over ‘Sales Target Data Set’ and click the ‘More’ button.
  4. Select ‘Reload’. The Data Set loading notification will appear at the top of the screen. It should load and disappear fairly quickly.
  5. In the ‘More’ menu, select ‘Properties’.
  6. In the ‘Properties’ menu, notice the created, modified and data refreshed dates & times in the upper right.
  7. In the ‘Properties’ menu, expand ‘Advanced’ and scroll down to see statistics about this data set:
  8. Size
  9. Number of rows
  10. Number of columns
  11. Time to refresh
  12. Refreshed by
  13. Close the properties menu.

Dashboard Creation

  1. Click the ‘More’ button next to ‘Sales Target Data Set’ and select ‘Create dashboard’ in the ‘More’ menu.
  2. Select the fourth available template in the ‘Create dashboard’ screen. It has a large rectangle in the center with four rectangles equally spaced at the top.
  3. Click ‘Okay’. The ‘New dashboard’ screen will load.
  4. On the left side of the screen, click on the ‘Assistant’ icon.
  5. Type ‘Show average sales target for country’ and hit enter. Note that Sales target total is a sum measure, not an average. Cognos is able to compute new aggregate types on the fly using natural language query.
  6. Click and drag the map that appears into the ‘drop here to maximize’ icon in the center of the screen.
  7. Drag ‘Retailer region’ next to the ‘This tab’ icon at the top of the screen to add it to the tab filters.
  8. Click the ‘Retailer region’ filter and select ‘Northern Europe’. The map will zoom to extent.
  9. Add additional visualizations to the dashboard as you see fit.
  10. Save your dashboard as ‘Employee Target Dashboard’.

Combining data sets with other sources using data modules

Data sets alone are great for simple requests; for more complex modeling requirements involving multiple tables, complex relationships or relative time, you are better off creating a data module. Data modules can contain multiple data set, database or Excel data sources and are great for IT and power users alike.

Creating a data module

  1. Click the ‘Plus’ button in the lower left side and choose ‘Data Module’
  2. In the ‘Select sources’ menu, select ‘Team Content’. Navigate to your folder in ‘BACon users’.
  3. Select ‘Sales Target Data Set’ and click ‘OK’. The data modules screen will open.
  4. Click the table ‘Sales Target Data Set’ in the source view. This will load a preview of the data from your data set into the ‘Grid’ tab. Here you can easily make comparisons for join compatibility and instantly see the results of any filters, groups or calculations you build.

Adding additional data sources

  1. Click the ‘+’ icon in the data source view and select ‘Add new sources’
  2. Click ‘Data servers and schemas’ and click ‘GOSALESDW/gosalesdw’. Click ‘OK’.
  3. Click ‘Discover related tables’ and explore the natural language model generation capability a bit. Then click ‘Previous’
  4. Click ‘Select tables’ and click ‘Next’.
  5. Choose ‘Go Time Dim’ from the available sources and click ‘OK’. This will add the Go Time Dim table to our data module.
  6. Repeat the above process to add:
    • Great_outdoor_sales data server
      • GOSALES/gosalesrt schema
        • Retailer table
        • Retailer Site table
        • Retailer Type table
    • Team content>BACon>BACon files
      • Sales data excel.csv
  7. At this point you should have six tables in your data module:
    • Sales Data Excel.csv
    • Retailer
    • Retailer Site
    • Retailer Type
    • Go Time Dim
    • Sales Target Data Set
  8. Save the model as ‘Sales Analysis Data Module’ in ‘your folder’.

Data cleanup and prep

  1. Click the arrow next to the ‘Go Time Dim’ table to expand it. Notice the large number of fields with a ruler icon. Cognos has incorrectly identified these as measures.
  2. Ctrl-click each ruler icon, then click the ‘properties’ icon in the upper-right corner of the screen. This will open the properties window.
  3. Change ‘Usage’ from ‘Measure’ to ‘Attribute’. Change ‘Represents’ to ‘Time’.
  4. Click the ‘Month De’ field then shift-click the ‘Weekday Tr’ field. This will select all the non-English fields in the table.
  5. Click the ‘more’ button and select ‘Remove’. This removes these fields from the model.
  6. Repeat this process in the ‘Retailer Type’ table to leave only ‘Retailer Type Code’ and ‘Type Name En’.
  7. Save your work.

Simplify table structure

  1. Click the ‘more’ button next to ‘Sales Analysis Data Module’ in the source view. Select ‘Table’ in the menu that appears.
  2. Click ‘Select Tables’ and select the three retailer tables.
  3. Click ‘Create a view of tables’ and click ‘Next’
  4. Rename the new table ‘Retailer Dim’
  5. In the ‘Selected Items’ menu, select the following fields:
    • Retailer Code
    • Company Name
    • Rtl City
    • Rtl Prov State
    • Rtl Country Code
    • Type Name En
  6. Click ‘Refresh’ to check your work. Click ‘Finish’ to create your new table.
  7. Examine your handiwork in the ‘Custom tables’ tab. This tab makes understanding the data flow in your model much easier.
  8. Save your work.

Create relationships

  1. Ctrl-Click ‘Sales Data Excel.csv’ and ‘Go Time Dim’.
  2. Right-click one of the selected tables and choose ‘Relationship’ under the ‘New’ header.
  3. The create relationship screen will open.
  4. Ensure the 1 indicator is under ‘Go Time Dim’ and the N indicator is under ‘Sales Data Excel.csv’. You can swap these by clicking the wheel icon in the lower left corner of the screen.
  5. Select ‘Day key’ from each table and click ‘Match selected columns’. This will create the necessary join criteria.
  6. Click ‘Refresh’ to see a preview of your join.
  7. Click OK to build the join.
  8. Repeat this process to create the following joins (a rough code sketch of the result follows this list):
    • Go Time Dim 1-N Sales Target Data Set
    • Sales Target Data Set N – 1 Retailer Dim
    • Retailer Dim 1 – N Sales Data Excel.csv
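
If it helps to see the resulting structure as code, here is a minimal pandas sketch of the same star-schema joins. The table and column names are stand-ins based on the tables used in this workshop, not anything Cognos generates.

    import pandas as pd

    # Tiny stand-ins for the module's tables (column names and values are illustrative).
    go_time_dim  = pd.DataFrame({"Day Key": [20200101, 20200102], "Current Year": [2020, 2020]})
    retailer_dim = pd.DataFrame({"Retailer Code": [7, 9], "Company Name": ["Alpine Sports", "Go Golf"]})
    sales        = pd.DataFrame({"Day Key": [20200101, 20200102], "Retailer Code": [7, 9], "Quantity": [10, 5]})
    targets      = pd.DataFrame({"Day Key": [20200101, 20200102], "Retailer Code": [7, 9], "Sales target": [400, 300]})

    # Go Time Dim (1) -> (N) facts and Retailer Dim (1) -> (N) facts:
    # each fact row picks up the attributes of its matching dimension row.
    sales_joined   = sales.merge(go_time_dim, on="Day Key").merge(retailer_dim, on="Retailer Code")
    targets_joined = targets.merge(go_time_dim, on="Day Key").merge(retailer_dim, on="Retailer Code")
    print(sales_joined.head())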

Refining and enhancing the model

Adding additional business logic can greatly enhance the value of your data module and make it much easier for professional and self-service authors to quickly create valuable content.

Creating calculations

  1. Ctrl-click ‘Quantity’ and ‘Unit sale price’ in the Sales Data Excel table.
  2. Click the ‘more’ button and choose ‘create calculation’.
  3. Name your calculation ‘Sale Total’ and select ‘x’ from the calculation drop down menu.
  4. Note the ‘Calculate after aggregation’ option on this screen; the sketch after this list shows why it matters.
  5. Click ‘OK’
  6. Repeat the steps above to create the following calculations:
    1. Cost total: Quantity x Unit cost
    2. Sale margin: Sale total – Cost total
    3. Plan variance: Sale total % Planned revenue
  7. Click the ‘more’ button next to the Sales Data Excel table and select ‘calculation’. This will open the advanced editor.
  8. Name your calculation ‘Sales Margin’.
  9. Copy-paste the following logic:
    1. (Revenue – Cost_Total)/ Revenue
  10. Click the ‘more’ button next to ‘Sales margin’ and select ‘Format data’.
  11. Select ‘Percent’ in the drop down. Click ‘OK’.
  12. Ctrl-click ‘Cost total’ and ‘Sale total’. Click the ‘more’ button and select ‘format’
  13. Select ‘Currency’ in ‘Format type’.
  14. Select ‘USD’ in the ‘Currency’ drop down.
  15. Click ‘OK’
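
For readers who think in code, here is a small pandas sketch of what these calculations amount to, including why the ‘Calculate after aggregation’ option noted above matters. The column names mirror the workshop data but the numbers are invented.

    import pandas as pd

    # Two invented rows standing in for the Sales Data Excel table.
    sales = pd.DataFrame({
        "Quantity":        [10, 5],
        "Unit sale price": [4.0, 6.0],
        "Unit cost":       [3.0, 2.0],
        "Planned revenue": [50.0, 25.0],
    })

    # Row-level calculations, matching the simple calculation editor.
    sales["Sale total"]    = sales["Quantity"] * sales["Unit sale price"]
    sales["Cost total"]    = sales["Quantity"] * sales["Unit cost"]
    sales["Sale margin"]   = sales["Sale total"] - sales["Cost total"]
    sales["Plan variance"] = sales["Sale total"] / sales["Planned revenue"]   # 'Sale total % Planned revenue'

    # 'Calculate after aggregation' controls when a ratio is evaluated:
    margin_pct_before = (sales["Sale margin"] / sales["Sale total"]).mean()     # ratio per row, then averaged
    margin_pct_after  = sales["Sale margin"].sum() / sales["Sale total"].sum()  # totals first, then the ratio
    print(margin_pct_before, margin_pct_after)  # the two differ, which is why the option exists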

Adding drill paths

  1. Expand the ‘Go time dim’ table and click the ‘more’ button next to ‘Current Year’.
  2. Select ‘Create navigation path’.
  3. Name the navigation path ‘Time’.
  4. Drag ‘Current Quarter’, ‘Current Month’ and ‘Day Date’ into the navigation path. Click ‘OK’
  5. In the source view, click the ‘Identify navigation path members’ button.
  6. Repeat the process to create the Retailer location path:
    1. Rtl Country Code
    2. Rtl Prov State
    3. Rtl City
  7. You can create any hierarchy you want – feel free to make something crazy!

Adding relative time

  1. Click the ‘+’ icon in the source view and select ‘Add new sources’.
  2. Navigate to ‘Team content>Calendars’ and select ‘Fiscal calendar’. Click OK. The Fiscal Calendar table should be visible in your module.
  3. Expand ‘Go Time Dim’ table and click ‘Day date’. Open the column properties.
  4. Under ‘Lookup reference’ select ‘Fiscal Calendar’
  5. Expand ‘Day Date’ and note the large number of relative time filters (the sketch after this list shows what one of them computes).
  6. Expand the ‘Sales Data Excel.csv’ table and shift-click the calculations we made above.
  7. Click the ‘more’ button and open the properties
  8. Set the lookup reference to Go Time Dim.Day Date.
  9. Expand any of the calculations and note the large number of relative time calculations.
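
To demystify what one of those relative time filters actually computes, here is a small pandas illustration of a year-to-date filter. It shows only the idea, not Cognos internals, and the dates and amounts are invented.

    import pandas as pd

    sales = pd.DataFrame({
        "Day Date":   pd.to_datetime(["2020-01-15", "2020-03-02", "2019-11-20"]),
        "Sale total": [400.0, 250.0, 900.0],
    })

    as_of = pd.Timestamp("2020-03-31")   # the reference date the calendar lookup supplies

    # Year to date: rows in the same year as the reference date, on or before it.
    ytd = sales[(sales["Day Date"].dt.year == as_of.year) & (sales["Day Date"] <= as_of)]
    print(ytd["Sale total"].sum())   # 650.0 -- the kind of number a 'YTD Sale total' measure returns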

Cleaning up for consumption

  1. Open the properties for ‘Sales Data Excel.csv’ and change the name to ‘Sales’.
  2. Change ‘Sales target data set’ to ‘Sales target’.
  3. Click the ‘more’ button next to ‘Sales Analysis Data Module’ and select ‘Folder’ in the ‘New’ section.
  4. Name the folder ‘Retailer tables’
  5. Click and drag the ‘Retailer’, ‘Retailer Site’ and ‘Retailer Type’ tables into the ‘Retailer tables’ folder. This will nest the tables within the folder.
  6. Click the ‘more’ button next to the ‘Retailer tables’ folder and select ‘Hide from users’.

Managing modules

Securing models

  1. Click the ‘Source view’ button.
  2. Expand ‘gosalesrt’ data server.
  3. Click the ‘more’ button next to ‘Retailer’ and select ‘set data security’.
  4. Click ‘Add security definition’ in the properties window. This will open the security window.
  5. Name your security filter ‘Analytics users’
  6. Expand the ‘Cognos’ namespace and select ‘Analytics users’. Click ‘OK’.
  7. Select ‘Company Name’ from the Filters drop down and click ‘Add a filter’. The filter window will open.
  8. Select ‘4 golf only’ and click ‘OK’
  9. This sets the data level security at the data server level, not on the level of your individual model! Changes made here will affect ALL models built on this data server, for better or worse.
  10. Click ‘Cancel’ to leave the security window.

Model inheritance

  1. Click ‘New’ and select ‘Data module’
  2. Navigate to the ‘Sales Analysis Data Module’ located in ‘your folder’. Click ‘OK’
  3. Notice how each table is now turquoise with a ‘link’ icon on it. These are linked tables and will inherit changes made to their parent module.
  4. Check the ‘more’ menu on the ‘Sales’ table. Note that many options are missing. The same is true for individual data items.
  5. Note the ‘Break link’ option in the ‘more’ menu. This will disconnect the linked table from its parent and it will no longer inherit changes. The ability to click this button can be turned off in security.

Conclusion

In this workshop we combined a data set sourced from a Framework Manager package, tables from a Microsoft SQL Server database and an Excel spreadsheet into a single data module, which we made available to self-service and professional authors.

I hope this workshop has unlocked a deeper understanding of how the Cognos Analytics data prep features can drastically reduce the time it takes to acquire, model and visualize data for both self-service users and IT professionals. As always, please contact me at rdolley@pmsquare.com with any questions!

Cognos Analytics 11.1.6 What’s New

April 27, 2020 by Ryan Dolley

Cognos Analytics 11.1.6 is live! This release features a ton of great quality of life enhancements. Chief among them is a great dashboard UI refresh, a new interactive data table in report authoring and some very welcome changes to data modules. There are also some shifts in IBM’s design language with refreshed icons, and a great new ‘help’ section that pulls support directly into the Cognos UI. Overall I feel this is a strong release. So let’s take a look at what’s new in Cognos Analytics 11.1.6.

Cognos Analytics 11.1.6 UI and usability updates

Cognos Analytics 11.1.6 brings two changes to the UX that you will notice across all features; they keep Cognos looking fresh while giving end users much improved support.

UI and design tweaks

Cognos Analytics 11.1.6 features more elements from the Carbon Design System

IBM continues to apply the Carbon Design System to Cognos Analytics. For those of you who are unfamiliar, this open source system for products and experiences drives the UX of many IBM products including Cognos, Planning Analytics, Watson Studio and Cloud Pak. For Cognos Analytics 11.1.6 this means first and foremost the adjustment of icons and fonts in the UI as well as more subtle tweaks located throughout. I personally think the new icons look fresh and appreciate the hard work of the IBM design teams in Toronto and Ottawa.

New ‘learn’ pane

The new ‘Learn’ pane brings guides and videos directly into Cognos

The ‘learn’ pane replaces the ‘help’ section and basically does a 360 degree dunk in its face. The improvement here is dramatic. Accessed via the new learn icon in the upper right corner, the learn pane is a context dependent help section that surfaces guides, documents and even videos directly in the Cognos UI. It understands what feature you are currently using and suggests related content – in the example above it suggests data modules assistance when accessed from the data modules UI.

Cognos Analytics 11.1.6 Dashboard Changes

New dashboard UI

Dashboards receive a most welcome UI refresh. It is much easier to understand visualization composition and formatting options thanks to changes to context menus and a new ‘fields’ section. Subtle tweaks also bring the tool closer to the Carbon Design System with new icons.

Dashboards receive a subtle yet significant makeover
  1. New icons in both the side menu and in the data tree. Cognos looks more and more modern with each release.
  2. The visualization properties are now pinned to the top of the screen like in reporting. This makes interacting with visualizations so much easier!
    1. Left clicking on a visualization will highlight the visualization. The pinned visualization properties at the top of the screen will apply to the selected visualization
    2. Right clicking on a visualization element like a bar, line, point or label will bring up the interactive options for just that element
  3. Access to options like linking visualizations, filters and properties has been grouped and simplified. There is also a new ‘Fields’ button
  4. The ‘Fields’ button brings up a new fields view, which shows the fields of the selected visualization. If you select a new visualization this section will change to reflect your new selection

This impactful redesign makes navigating dashboards so much easier compared to previous releases because you can always tell what visualization is selected and easily and quickly access all visualization elements. You no longer have to rely on hidden menus and focus mode, which has received a redesign.

Revamped focus mode

Focus mode now functions as ‘full screen’ for individual visualizations

Focus mode has been repurposed and now functions basically as a ‘full screen’ button for an individual visualization. As a result of the general UI changes all the old focus mode features reside in the new ‘Fields’ view. Focus mode is now also available to dashboard viewers, not just in edit mode.

Expand/collapse in crosstabs

Expand/collapse brings me fond memories of PowerPlay

This is one of those features that people have been asking for since 2005 and suddenly it has arrived! And better yet, it functions exactly like you hoped it would. Expand and collapse currently function only for OLAP sources – not for navigation paths made in data modules. I’ve been told this will be fixed in an upcoming release.

Enhanced unit formatting

Enhanced unit formatting brings additional flexibility to measure display

With enhanced unit formatting, dashboard users now have the ability to append a custom label to the end of measures displayed inside Cognos visualizations. For example, if you have a field that is measured in units you can label them as such.

View source in dashboards

It is now possible to identify the exact field used in a visualization

This is a very welcome addition. A new popup appears whenever you hover over a data item in the fields view. This solves a major problem. In previous versions it was impossible to tell which table supplied a field in a visualization, leading to much confusion when there were multiple fields with the same name in a model.

Be sure to read Matt Denham’s great overview of Cognos 11.1.6 dashboards.

Cognos Analytics 11.1.6 Reporting Changes

Cognos 11.1.6 doesn’t contain a ton of reporting changes but the ones we get are very good. The data table in particular gives a ton of functionality to end users that we’ve been requesting for a looooooooong time.

Meet the data table

The data table is a new object in Cognos report authoring that provides the type of interactivity that end users crave without JavaScript hacks. This is possible thanks to changes in how the data table queries and processes information. Much like Cognos 11.1 visualizations, the data table issues a single query to fetch data. The browser then stores, filters and renders that data based on user input.

Data tables do things that you’ve wanted since 2004…
  1. Expand/collapse for OLAP data sources
  2. Each column features interactive filter and search capability
  3. Color/size/image indicators for KPIs
  4. Scroll bar exists in the data table rather than for the page

All this flows from changes to how Cognos queries data and renders the data table. This is a preview of where reporting is headed, and you should expect this paradigm for many objects including prompts in the near future. For some reason most of this functionality defaults to ‘off.’ To enable it you must:

  • Set ‘Show column filters’ to ‘Yes’ in the data table properties
  • Click ‘enable expand and collapse’ in the grouping and summary popup found in the data table properties

I haven’t had time to play around with the data table as much as I’d like, so I cannot comment about specific formatting options that may be missing – I assume there are many. However even if it’s not as ‘pixel perfect’ as the list object, the data table is a killer addition to your authoring toolkit.

11.1 reporting visualization enhancements

Cognos 11.1 visualizations receive a number of enhancements in 11.1.6 to bring the authoring experience more in line with the interactivity available in dashboards.

  • Measure groups in report authoring (this was already available in dashboards)
  • Drill up and down (again, already available in dashboards)
  • Categorical map coloring

You can read more about these changes in Rachel Su’s helpful blog post here.

Cognos Analytics 11.1.6 Data Module Changes

Data modules didn’t receive new features in this release. Instead there are three major quality of life enhancements for authors related to interaction with data servers. Small changes like these save authors a ton of headaches. I’m glad IBM consistently improves product usability rather than just piling on new capabilities.

Add individual fields to data modules

Adding data to data modules just got so much easier

Data modules are easy to build but surprisingly frustrating to edit. A big part of the problem came from the fact that you could only add tables to data modules, not individual fields. As a result you sometimes had to re-add hundreds of fields then manually delete them just to get a single new field into your module. Consider this solved – you can now add individual fields directly to the tables within a data module.

See unused fields in data sources

Seeing unused fields makes it easy to adjust Cognos 11.1.6 to match the structure of your database

The ‘show unused items’ feature makes it easy to identify which fields in your data source are not currently in your data module.

Reload metadata schema from data modules

Reloading metadata is much easier for module authors now

The disconnect between data modules and data servers causes frustration for modelers for two reasons. First, it requires a ton of clicks to leave the module interface to make data server changes. Second, modelers frequently find themselves locked out of data server settings entirely and must ask administrators to make server changes. Cognos Analytics 11.1.6 alleviates part of this issue by giving authors the ability to reload metadata directly from the data modules UI.

Small changes are a big deal

These three changes are small but add up to a significantly improved modeling experience. It is now significantly easier to load metadata, identify new fields and add only those that you need to your model.

Cognos Analytics 11.1.6 AI & Advanced Analytics Enhancements

The AI assistant gets smarter with each release. Cognos Analytics 11.1.6 brings two significant enhancements to what is rapidly becoming the flashiest feature in the solution.

This dashboard was created from a single command, filters and all

AI learning for visualizations

This subtle but very cool change makes dashboards and explore much better for end users. Cognos Analytics will now learn your visualization preferences whenever you click the ‘save’ button and use this information to suggest visualization types that match your preferences. For example, if Cognos suggests a bar chart and you change it to a column chart, Cognos becomes more likely to suggest a column chart in the future. Right now this works on an individual level rather than system wide. You can control this with the AI > Learning capability.

Automatic dashboard creation enhancements

Guided dashboard creation shows extremely well and is getting more useful with each release. The AI Assistant can now accept conditions alongside the ‘create dashboard’ command such as ‘Create dashboard for products by average profit in Florida’. This will generate a complete dashboard based on the criteria passed in, including the state = Florida filter as well as converting profit from a sum to an average. Very impressive stuff.

Stand alone calculations in advanced analytics

This enhancement is important for anyone using Framework Manager… so basically everyone. The AI assistant and explore capabilities in previous versions could not factor in standalone calculations. Framework Manager often requires a standalone calculation to aggregate correctly. The only way around this was to build a data set – which I still strongly urge you to do for a myriad of reasons outlined in my article What are Cognos Analytics data sets. However as of 11.1.6 the AI Assistant and the Explore features now consider these important calculations when they do their magic.

See this interesting update from Jason Tavoularis to learn more about Cognos Analytics 11.1.6 AI advancements.

Support for R kernel in notebooks

Cognos Analytics 11.1.6 notebooks now support the R kernel in addition to Python. IBM’s implementation of Jupyter is very good insofar as they wisely chose to implement standard Jupyter rather than some kind of Cognos-themed reskin. Now you have another reason to check it out.

Don’t miss our Cognos Analytics 11.1.6 YouTube livestream

Want to see 11.1.6 live? PMsquare livestreams each Cognos Analytics release on YouTube. The Cognos Analytics 11.1.6 release stream goes live on 4/28/2020 at 3:00PM Eastern and will remain available afterwards so be sure to watch!



When To Use Cognos Data Sets

April 14, 2020 by Ryan Dolley

I introduced you to Cognos Data Sets in Part 1 of this series and you recognize some intriguing possibilities. You want the massively improved performance, simple presentation for end users and quick road to Cognos modernization that Data Sets offer, but you’re not sure how to start. Well I’m here to help you understand when to use Cognos Data Sets – how to recognize each situation and how Data Sets help.


Prepare Data for Advanced Analytics

Advanced analytics features require high quality data to function properly

Advanced Analytics features like forecasting in Cognos often work much better with Data Sets than other data source types. A narrowly focused, in-memory source dramatically enhances the speed, interactivity, accuracy and usefulness of features like Explore or the AI Assistant. This is especially true compared to giant Framework Manager models.

Recognizing poorly prepared data

The need to prepare data is most apparent when the advanced features of Cognos Analytics fail to provide meaningful suggestions or build garbage output. This manifests in the following ways:

  • The AI Assistant cannot understand which instance of ‘customer’ you want and picks it from an incorrect namespace
  • The AI Assistant makes very poor suggestions
  • AI generated visualizations do not filter properly because they contain different versions of the same data item – ‘customer’ from 3 different tables
  • The ‘generate dashboard’ command creates a nonsense dashboard
  • The forecasting feature does not appear in line, bar or column charts
  • Explore takes a very long time to load or interact with

Using Data Sets for advanced analytics

Data Sets make it easy to simplify the data used for advanced analytics. Because they are quick to make and perform very well I use them any time I want a great experience for my end users. The goals of using Data Sets for advanced analytics are:

  • Remove any duplicates in the data. Each field should occur only once
  • Identify a specific subject of analysis and include only measures and fields that help understand that subject. The explore feature helps immensely with this
  • Help the AI Assistant shine by producing meaningful results
  • Improve performance across the board, especially in Explore

Improve Performance of Existing Models

Most long time Cognos customers have at least some models that perform slowly. Maybe it’s logic processing at run time. Maybe it’s the underlying database. Whatever the cause, you can’t let your end users watch a wheel spin for minutes on end whenever they make a slight change to a dashboard. Oftentimes customers solve this problem by locking Dashboards, Stories, Explore and anything else new and cool away from users. That’s a big mistake.

Recognizing poor performance in Cognos

This is fairly straightforward. You know performance is poor because Cognos is slow, right? Generally yes but there are some situations where poor performance manifests in surprising ways.

  • People call you and say ‘Cognos is slow’
  • You check Thrive and it tells you ‘Cognos is slow’
  • User adoption for self service features is lacking
  • Schedules are frequently late or are challenging to maintain
  • Source systems process dozens or hundreds of similar queries
  • You just keep staring at that damn spinning wheel

Improving Performance with Data Sets

This is an area where Data Sets shine because you’ve already got a model with all sorts of embedded business logic. It’s extremely easy to generate data sets as needed, and they automatically inherit all that Framework Manager logic. Very little data rework results in huge performance gains. Your goal is to:

  • Take advantage of in-memory processing and server RAM
  • Summarize detailed data to a higher grain to decrease row counts and better target analysis
  • Sort data by commonly filtered data items
  • Filter out unnecessary records
  • Decrease load on underlying data bases
  • Banish the spinning wheel forever

Imagine a query that processes for 15 minutes and runs 100 times a day. You are spending 1,500 minutes processing that data. By moving to a data set, the query runs once for 15 minutes to load the data. All subsequent executions load in ~1 second as data pulls from memory, not the database. You just saved nearly 1,500 minutes of processor time. And you saved the sanity of your end users.
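
If you want to sanity-check that arithmetic, here is the back-of-the-envelope version; the numbers are the hypothetical ones from the paragraph above, not a benchmark.

    runs_per_day  = 100
    query_minutes = 15

    direct_cost   = runs_per_day * query_minutes                # 1,500 minutes hitting the database every day
    data_set_cost = query_minutes + runs_per_day * (1 / 60)     # one 15-minute load plus ~1-second in-memory reads

    print(f"Direct queries:  {direct_cost:.0f} minutes/day")
    print(f"With a Data Set: {data_set_cost:.1f} minutes/day")
    print(f"Saved:           {direct_cost - data_set_cost:.0f} minutes/day")   # just shy of 1,500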

A real world performance example

My friend Rory Cornelius gave me the following quote about Data Sets. Rory actually did this with one of our clients. It shows how these techniques work to solve all sorts of Cognos problems.

Not your typical use case, but my client has this huge set of scheduled jobs. There was one job that had 10 reports each that queried almost the same data. Each report took about 45 minutes to run and they wanted them done sequentially to limit load. I pointed the reports to a Data Set instead, and they took 2 minutes to run. The Data Set still takes quite a while to load, but even with that, the total time was cut by at least 5 hours with significantly less load on the database.

Rory Cornelius, Senior Solution Architect with PMsquare

Combine Data Sources in Cognos

Throughout my career the number one impediment to analytics delivery has been the struggle to combine data from multiple databases or applications. Data exists at different levels of detail with messy, mismatched keys and incompatible query languages. It’s just tough out there. However Data Sets radically streamline this process, especially for data already in Cognos. They provide a form of lightweight ETL and query processing to supplement fully featured tools like Incorta or IBM ADP/Trifacta.

A real world example of combining data sets from one of my clients

Recognizing data source mashup bottlenecks

Whether it’s a lack of clear requirements or an IT bottleneck for ETL, projects often wait for months or years at this stage. Faced with mounting delays, frustrated end users often choose to export data from Cognos and go it alone in Power BI. But you can learn to recognize the signs of data mashup bottlenecks:

  • The data warehouse request backlog grows to many multiples of the Cognos backlog
  • End users export tons of data to excel
  • Advanced metrics are challenging to build because you are missing key calculation components
  • You often make model or data warehouse changes to add just a few columns or tables

Combining data with Cognos Data Sets

The process of combining data sources using Data Sets could hardly be easier as I outlined in Part 1 of this series. Instead the challenge lies in working through the logic of how best to combine two sources. The main things you will need to do are:

  • Identify the fields required for your analysis and locate them in your data sources
  • Create a data set for each source
  • Aggregate data at a compatible level of detail
  • Perform necessary data cleansing to make joins possible
  • Add filters, calculations or other logic at the Data Set level, not in Data Modules or Reports/Dashboards
  • Schedule data sets so that they build in the correct order
  • Combine them by joining together in a data module

This technique allows you to dramatically simplify some complex ETL tasks with large and complex databases by first boiling each source down to just the fields you need. The key thing is to embed as much logic into the Data Set load process as possible. This minimizes query cost at run time and makes building and maintaining your Data Modules as easy as possible.
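
As a concrete picture of ‘aggregate data at a compatible level of detail’, here is a hedged pandas sketch: daily transactions are rolled up to the month before joining to a monthly target table. The table and column names are invented for the example.

    import pandas as pd

    daily_sales = pd.DataFrame({
        "Order date": pd.to_datetime(["2020-01-03", "2020-01-17", "2020-02-05"]),
        "Revenue":    [120.0, 80.0, 200.0],
    })
    monthly_targets = pd.DataFrame({"Month": ["2020-01", "2020-02"], "Target": [150.0, 190.0]})

    # Roll the detailed source up to the grain of the other source before joining.
    daily_sales["Month"] = daily_sales["Order date"].dt.strftime("%Y-%m")
    monthly_sales = daily_sales.groupby("Month", as_index=False)["Revenue"].sum()

    combined = monthly_sales.merge(monthly_targets, on="Month")
    combined["Variance"] = combined["Revenue"] / combined["Target"]
    print(combined)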

Simplify Presentation for Self Service

Framework Manager models typically exist for IT and accumulate years or even decades’ worth of developer-focused design decisions. As a consequence they often require crucial yet undocumented context to generate accurate and timely queries, with a host of conditional flags, hidden filters and inscrutable calculations. Self service becomes impossible when end users don’t understand the structure or context of data. This is the number one objection I hear to rolling out Dashboards or Explore in Cognos Analytics.

Recognizing overly complex models

An overly complex model stands between you and the evolution of your BI practice like an unbridgeable chasm. Its calling card is the list of things you cannot accomplish because ‘the data is too complex.’ You know it by:

  • End users cannot effectively use the model, or you have locked them out of it due to data quality concerns
  • Self service feature roll out met with limited success due to data complications
  • Use of your data is always accompanied by caveats, ‘You have to include flags x,y,z to get meaningful results’
  • Debugging data problems is extremely confusing or time consuming
  • New hires to the BI team require weeks or months to get up to speed with the data

Preparing Data for Self Service

Data Sets are the bridge across this chasm. Because they are so easy to make and inherit all the logic from your Framework Manager source, you can embed and effectively hide the underlying complexity with a well designed Data Set. You will need to build multiple Data Sets from the same model to effectively simplify the presentation – this is by design. Remember, the goal for Data Sets is to break a ‘one size fits none’ model into smaller, usable components. Let your data sets multiply!

  • Break your large model into smaller, digestible subject areas based around the types of questions your users need to answer
  • Build a Data Set for each subject area
  • Don’t be shy about overlapping data in multiple Data Sets. The end goal is to make something easy to use for an individual subject area
  • Don’t be shy about building lots of Data Sets
  • Remember – the Data Set inherits your Framework Manager logic. You should have a high degree of data consistency across Data Sets as a result
  • Always be willing to alter, change, abandon and create new data sets based on evolving user needs.

Your instincts from Framework Manager probably tell you to come up with a grand, cross-Data Set design to ensure consistency and eliminate reuse of fields. Don’t do this. Remember, tailor each Data Set to the needs of its user community and be willing to adapt as those needs change. This is the key to modernizing Cognos to compete with Tableau or Power BI.

Modernize Your Cognos Practice

By following all these steps you will modernize much of your Cognos Analytics practice without intentionally doing so. A modern BI practice requires two modes of operation, often called ‘Mode 1’ and ‘Mode 2’. Mode 1 is the traditional enterprise BI way of doing things; ETL, ODS, EDW, monolithic Framework Manager models, IT authored reports. It remains a vital component of our work. However Mode 2 is equally important; Agile data mashup, in-memory processing, collaboration with self-service users and above all, speed.

The techniques outlined above will get you to mode 2 rapidly, even if it seems daunting or impossible today. Because you’ve done so much great work building your Framework Manager models you have an incredible foundation for self service – you just haven’t realized it yet. Using Data Sets in combination with Data Modules and Dashboards will give you the performance, simple data presentation and agility you need. Try it! And as always if you need some help along the way reach out to me and PMsquare. The answer to ‘When to use Cognos Data Sets’ is ‘Now!’



What are Cognos Data Modules?

April 8, 2020 by Ryan Dolley

Cognos Data Modules are a web-based data acquisition, blending and modeling feature available in Cognos Analytics. They first hit the scene as part of Cognos 11 and are meant to supplement and eventually replace Framework Manager for both self-service and IT data modeling needs. I’ll pause for a second to let you long-time Cognoids hyperventilate a little… is everyone back? Good. Through this and subsequent posts I’ll try to dispel misconceptions about this awesome feature of Cognos while making you comfortable and – dare I say – excited to use them.

Data modules – the wave of the future

Data Module Features

Imagine a data modeling solution that has the following features:

  • Easy to install and manage
  • Join dozens or hundreds of tables across multiple databases
  • Execute cross-grain fact queries
  • Build simple or complex calculations and filters
  • Build alias, view, union and join virtual tables
  • Secure tables by groups, roles and data elements
  • Create OLAP-like dimensional hierarchies
  • Enterprise governance, auditability and security

 ‘Okay easy, I’m imagining Framework Manager’ you’re thinking right now. Yes! But, add in:

  • Natural-language and AI powered auto-modeling
  • Automatic join detection
  • Easy integration of Excel data
  • Automatic extraction of year, month, day from date data types
  • Automatic creation of relative time filters (YTD, MTD, etc.) and measures (YTD Actuals, MTD Actuals, etc.)
  • In-memory materialized views (data sets)
  • In-memory query cache
  • Direct access to members for relational sources!

‘Well that’s not Framework Manager… it must be Tableau, right!?’ No, in fact Tableau doesn’t offer even half of these capabilities. This is what every Cognos Analytics customer gets out-of-the-box in data modules today, with more features being added all the time.

Who are Data Modules for?

Many of my longtime customers have the misconception that data modules are for ‘end users’ only and that real data modeling can only be accomplished in Framework Manager. Conversely my new customers have built entire BI practices while having no idea what Framework Manager is. Clearly something is out of sync here, so let me make it very clear: Who are data modules for? If you’re reading this, the answer is you.

The Business User

The line between ‘end users’ and the BI team has gotten fuzzy in the last few years as increasingly complex models are built by people outside the IT department. Data modules are ideal for someone who wants to quickly and easily combine enterprise data with departmental data or Excel spreadsheets and cannot wait for IT to build an FM package or SSAS cube. The interface is clean and easy to use, and the ease of creating custom groups and building relative time calcs makes data modules an ideal place to combine data – even easier than Excel in many cases. As an added bonus, it’s very simple for the IT team to take a ‘self-service’ data module and incorporate it into enterprise reporting without significant development work.

The Cognos Pro

Many Cognos pros kicked the tires in 2016 and could only see the yawning chasm of functionality that separated data modules from Framework Manager, myself included – for years I encouraged my clients to consider them for niche applications but to rely on FM for anything important or difficult. No longer! As of the 11.1 release, data modules have reached feature parity with Framework Manager in almost all respects and even surpassed FM in important modeling automation tasks like relative time automation. It is no longer the obvious choice to default to Framework Manager for new Cognos development.

Data Modules vs Framework Manager

Given the enhancements to data modules, which should you choose? As of the 11.1 release my recommendation is to do all new development in data modules for the following reasons:

  • Significantly easier and faster to create
  • Great features like relative time, date column splitting, grouping
  • Target of all future development
  • Unlock modern BI workflow

These points are explored in detail here – for now I’ll leave you with a final thought. My new clients use the same ol’ Cognos to deliver with the speed and scale you’d expect from Tableau or Power BI – my friend Vijay can tell you all about it. The key differentiation between them and legacy Cognos installations with orders of magnitude more resources is the embrace of data modules and the iterative, build-it-in-prod approach to BI delivery that data modules enable.



What Are Cognos Data Sets?

April 7, 2020 by Ryan Dolley

I’ve explored Data Modules in depth on this blog over the last year with the hope of showing you how awesome data modeling in Cognos Analytics can be if you really embrace it. There is, however, an additional piece of the Cognos data puzzle that you need to understand to unlock the full potential of the platform – the Data Set. So let’s answer the question – just what are Cognos Data Sets?


The IBM Blueview Data Set Series

What are Cognos Data Sets?
When to use Cognos Data Sets


What is a Data Set in Cognos?

Data Sets offer an in-memory data processing option for Cognos Analytics

Simply put, a Data Set is a data source type in Cognos Analytics that contains data extracted from one or more sources and stored within the Cognos system itself as an Apache parquet file. The parquet file is then loaded into application server memory at run-time on an as-needed basis. This (usually) greatly enhances interactive performance for end users while reducing load on source databases. When combined with Data Modules, Data Sets offer incredible out-of-the-box capabilities like automatic relative time, easy data prep and custom table creation.
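
If a mental model helps, the pandas sketch below does the same extract-and-cache trick in miniature: query the source once, write a parquet file, and serve later reads from that file instead of the database. This is only an analogy for the concept, not how Cognos itself is implemented, and it assumes pyarrow (or fastparquet) is installed.

    import pandas as pd

    # One potentially expensive pull from the source system
    # (in real life this might be pd.read_sql(...) against the warehouse).
    source_df = pd.DataFrame({"Region": ["Americas", "EMEA"], "Sales target": [1_000_000, 750_000]})

    # Materialize it once as a compressed columnar extract.
    source_df.to_parquet("sales_targets.parquet")

    # Subsequent reads hit the extract, not the database, and load into memory almost instantly.
    cached = pd.read_parquet("sales_targets.parquet")
    print(cached)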

Data Sets are also extremely easy to build from your existing Framework Manager or Transformer packages making them an excellent option for getting the most out of your legacy Cognos 10 models. In fact this is probably the #1 use case for the Data Set technology and is the absolute fastest way to modernize your environment and turn Cognos into a rapid-fire data prep and visualization machine.

I’m going to write a full blog post about the exact situations that suggest a Data Set solution, but in short you should consider using Data Sets whenever:

  • Excellent interactive performance is a critical part of your deliverable
  • You wish to limit extremely costly SQL queries by re-using results
  • You must join multiple data sources together or accomplish other ETL tasks within Cognos rather than source systems
  • Existing Framework Manager or Transformer models are too complex or too slow for self-service
  • Someone tells you Cognos is slow but Tableau or Power BI are fast (those tools use Data Set-like technologies to enhance interactive performance)
  • You just want to do something really cool

Which Features can use a Data Set

There is one small limitation to Data Sets – while they function as a data source for all other Cognos Analytics features, they cannot be used directly to author reports. The solution to this is simple – wrap them in a Data Module and import the Data Module to Report Authoring. You should be doing this anyway for all Data Sets as it provides maximum deployment flexibility and ease of upkeep. I will cover best practice topics like this in a future article.

How to Build a Data Set

The ‘Create data set’ capability is hidden among model options

Building a Data Set is simple, especially if you have existing Framework Manager or Transformer models available in Cognos. In fact Data Sets can only be built on top of existing models or Data Modules – not directly on data servers. IBM has helpfully hidden the ‘Create data set’ capability in the ‘more’ menu of model objects in the environment, so it’s surprisingly easy to miss.

Cognos Data Set Creation

Creating a Data Set is a straightforward process, especially for experienced Cognoids. The UI is actually a re-skinned version of Report Authoring and many of your favorite tricks will work here. Building a Data Set is as simple as dragging columns into the list object, saving and loading data. Of course there are additional options you can take advantage of.

The Cognos Analytics Data Set creation screen shares many features with the Report Authoring interface
  1. Source View: Browse the tables and fields in your data source exactly as you would in Report Authoring
  2. Data List: The data table shows a live view of the Data Set as you build it. It queries new data as you make changes
  3. On Demand Toolbar: The on demand toolbar appears when you click on a column, giving you the ability to filter and sort.
    1. Filtering: Filters help you focus the data in your Data Set to just what you need. Fewer rows = better performance.
    2. Sorting: Sorting by the columns most used in report or dashboard filters (for example, time data) can greatly improve performance
  4. Query Item Definition: The query item appears when you double click a column header. You have access to query item functionality from Report Authoring, which means you can really accomplish a lot from this popup.
  5. Preview: Unchecking the preview button switches the data table into preview mode which turns off automatic data query as you make adjustments to your Data Set.
  6. Summarize and Row Suppression: The summarize function rolls your data up to the highest level of granularity, for example rolling daily data up to the month. Row suppression was honestly a mystery to me; special thanks to Jason Tavoularis at IBM for an explanation – row suppression in data sets only applies to dimensional data sources and does the same thing as using row suppression in Report Authoring.

Once you’ve imported your desired data, set your filters, sorts and summaries and maybe added a few calculations for good measure it’s time to save, load and deploy your Data Set.

Saving and Loading a Data Set

Data Sets must be saved and loaded to be available

When you save a Data Set you will see the option to ‘Save and load data.’ This will allow you to select a directory in Cognos to house the Data Set object. It will also issue one or more queries to retrieve data and populate a parquet file. This file is stored in Cognos and loaded into memory upon request when users access the Data Set. Check out the ‘Flint’ section of this in-depth article to understand what happens under the hood during Data Set creation and query.

Scheduling and Managing Data Sets

Data Sets only contain data from their last load; it is good practice to get in the habit of scheduling and monitoring Data Sets to ensure they contain relevant data and continue to perform well.

Data Set Scheduling Options

Data Sets have the same scheduling options as reports

The easiest way to schedule Data Sets is via the ‘schedules’ tab in Data Set properties. Data Sets and Reports share all the same scheduling options, including the ‘by trigger’ option. Scheduling via a trigger makes it easy to ensure Data Sets only load after your ETLs complete. This works great for simple or one-off scheduling tasks.

For more complex schedules, Data Sets are available in the Job feature. Again, they function as if they were reports as far as building Jobs is concerned.

Data Set Management

Manage Data Sets using their advanced properties

The Data Set properties screen contains the info you need to effectively maintain fresh and performant data for your end users. At the top of the window you can see the last load date of the Data Set, while expanding the ‘Advanced’ section exposes the following:

  • Size: The compressed size of the parquet file on disk
  • Number of rows: The number of rows in your data set. Keep this under ~8 million for best performance
  • Number of columns: The number of columns in your data set. No hard limit here, just don’t include columns you don’t need
  • Time to refresh: The time it takes for the Data Set to load
  • Refreshed by: The name of the person who last refreshed the data set

I will write a longer post about Data Set tuning and troubleshooting. For now it’s key to keep in mind the row and column suggestions above. And while ‘Time to refresh’ is important, this represents the time it takes to load data and has no impact on the performance end users will experience. The beauty of Cognos Data Sets is that by front-loading the processing, you can create a complex result set that takes hours to load but offers sub-second response time to end users.

A Real World Example of Data Sets in Action

I have used Data Sets in many successful client engagements to greatly improve performance, simplify presentation or accomplish ETL tasks in an afternoon that their DW team had put off for years. Here is a simple example for you.

The Problem: Metrics, metrics everywhere!

This customer came to us with a very, very common problem. The sales support team had identified a need for some new advanced metrics and built out a prototype dashboard. However, the underlying data was divided between two Microsoft SSAS cubes and a handful of tables in the EDW. The data warehouse team had given an estimate of many months to create the necessary tables and cubes.

The Solution: Cognos Analytics Data Sets

The customer brought in PMsquare on a 40 hour contract to make this happen. If your initial reaction to that contract length is skepticism I don’t blame you. In Cognos 10 this would have been impossible. However thanks to Data Sets I was able to do the following:

  • Extract the needed data from each SSAS cube and the EDW into a Data Set. There were 3 Data Sets total, one from each data source.
  • Join the Data Sets together into a Data Module and add in all the Data Module goodies like relative time
  • Create a new, final polished Data Set from that Data Module to simplify presentation and improve performance
  • Build out the customer’s dashboard

The customer was extremely satisfied with the end result, which looked something like this:

A cavalcade of awesomeness awaits you with Data Sets

Cognos Analytics Data Sets in Summary

As you can see, I really was able to accomplish months of work in a single week using Data Sets. Obviously this technology cannot replace all ETL tasks; however, Cognos Analytics is now an option for low to medium complexity transformations. And you now have a slam-dunk option for rapidly simplifying presentation or improving performance vs even the simplest database view.

Be sure to check back next Tuesday, 4/14/2020 for part two of this series: When To Use A Data Set!



The Gartner Magic Quadrant is Worthless: Cognos Edition

February 19, 2020 by Ryan Dolley

Another February brings another edition of everyone’s favorite yearly head scratcher – the Gartner Magic Quadrant for Business Intelligence! I have expressed some strong opinions on the worth of the Magic Quadrant as a tool for decision makers in the past and this year will be no different. As always I strongly urge you to read the report rather than just rely on the picture as (once again) vendor positioning on the scatter plot feels extremely disconnected from the analysis contained below it. So let’s take a look at the 2020 Magic Quadrant as it relates to Cognos Analytics.

IBM’s position and Gartner’s written analysis are out of sync

A big change to the Magic Quadrant this year is the return of enterprise reporting as a key differentiator for what they are now calling ‘ABI platforms.’ The second differentiator is ‘augmented analytics’, which is integrated ML and AI assisted data prep and insight generation. Gartner is now calling visualization capabilities a commodity. What’s old is new again.

The return of enterprise reporting

This should be great news for Cognos Analytics! Cognos is a recognized leader in enterprise reporting. In fact Cognos’ reliance on enterprise reporting was the raison d’être for knocking it out of the leaders quadrant to begin with.

It’s extremely curious, then, that IBM’s positioning on this quadrant was not more markedly improved. It’s even more curious that Gartner writes of enterprise reporting, ‘At present, these needs are commonly met by older BI products from vendors like…IBM (Cognos, pre-version 11)’. It’s almost as if Gartner is unaware that Cognos 11 meets the same enterprise reporting needs as previous versions. At the very least they seem unwilling to give IBM credit for it on the chart. The write-up tells a different story.

Augmented analytics gain steam

The second differentiator on the Magic Quadrant is also good news for Cognos. The platform’s augmented analytics capabilities have seen tremendous investment in the 11.1 release stream and are legitimately ahead of most vendors I have hands on experience with (Power BI, Tableau, Domo, Incorta being the primary ones.) Observe:

  • Automated ML driven forecasts
  • Chatbot for NLQ and visualization creation
  • An entire AI driven augmented analytics interface
  • AI driven data prep
  • Integrated jupyter notebooks that write to and read from Cognos data

That’s a lot. If you want a comprehensive set of powerful, modern augmented analytics capabilities Cognos is a great choice.

The fact is that Cognos’ strength lines up perfectly with Gartner’s 2020 market differentiators, while its only ‘weakness’ – self-service visualization – is now considered a commodity. Again, why are they so poorly represented in the MQ image, and does the actual analysis tell a different story?

What does Gartner say about Cognos Analytics?

This write up is a lot rosier for IBM than the dismal MQ image suggests. I’ve summarized Gartner’s written analysis of IBM for you below:

Strengths

  • Cognos is one of the few offerings that offers all critical capabilities and differentiators in a single platform
  • IBM’s roadmap includes AI driven data prep, social media analytics and a long term goal of unifying self-service, enterprise reporting and planning (think Planning Analytics) in a single platform
  • Cognos can be deployed on-prem or in any cloud, unlike many other vendors

These significant strengths seem totally disconnected from where they have IBM placed on the quadrant. If enterprise reporting and augmented analytics are key differentiators between ABI platforms and Cognos is one of the only offerings that has it in a unified platform, how are they not better represented on the completeness of vision axis? Baffling!

Cautions

  • It is not often the sole enterprise standard
  • We think it costs more than other vendors
  • People don’t call us as much as they used to about Cognos

That last point is the key to unlocking the reality of how the Gartner Magic Quadrant for business intelligence really works. Let’s see why.

Gartner is a self-driven feedback loop

A huge component of ranking on the Gartner Magic Quadrant for Business Intelligence is straight up how often prospective customers call them about various tools. They don’t call asking about Cognos very much, ergo Cognos has a poor ranking. Don’t believe me? Look at my analysis of their MQ for planning platforms to see how survey scores seemed to have no impact on their ranking of Oracle as the market leader – Oracle’s survey scores were horrible!

Ask yourself, would you call Gartner to discuss a BI tool they rank in the bottom third of vendors? You wouldn’t. You call Gartner to talk about Microsoft, Tableau, Qlik and (bafflingly) Thoughtspot. Otherwise you call someone else. As long as this remains a major criterion for ranking, Gartner will remain a market-distorting self-feedback mechanism.

By this same logic, Cognos is the world’s #1 business intelligence tool in the Ryan Dolley Magic Quadrant as it represents 90% of my calls!

Why the 2020 Magic Quadrant should make you feel good about IBM Cognos Analytics

The BI market is shifting once again. Visualization is a commodity, enterprise reporting is king and augmented analytics is on the rise. As I’ve outlined above, IBM Cognos Analytics’ feature set is extremely well positioned to thrive in the landscape Gartner describes, whether or not they recognize it. There simply is no platform that offers the total package of mode 1, mode 2 and augmented analytics like Cognos.

Want to further the conversation? Connect with me on LinkedIn and check out PMsquare’s website for help getting the most out of Cognos.


