
IBM Blueview

Cognos Analytics and all things IBM



Cognos Relative Dates in 11.2

July 15, 2021 by Ryan Dolley 7 Comments

Relative dates are a powerful feature of Cognos Analytics data modules that enable quick and flexible comparisons between time periods. For example, comparing revenue for the current year versus the previous year, or the current week of the month versus the same week last month. These comparisons are popular across all lines of business, but especially in the office of finance, which relies on period-over-period analytics as a bedrock of analysis. The video below covers how to set up and use relative dates in Cognos Analytics.

Live demonstration of relative date functionality in Cognos Analytics.

Relative dates are a feature exclusive to data modules. If you want to use relative dates in Framework Manager, your best bet is to extract the necessary fields into a data set. You can then import the data set into a data module and set up relative dates there. In all likelihood you’ll also get better query performance because data sets are great!

Setting up relative dates

There are two components to the relative date functionality in Cognos Analytics: relative date filters and relative date measures. Both are easy to configure and use. To set them up you need the following:

  • The ability to use data modules
  • Access to the ‘Calendars’ folder that ships with Cognos
  • Data that includes a ‘date’ or ‘date/time’ data type column
  • The ability to use the linked module feature of Cognos

Your administrator can turn on the relevant permissions. Once that’s done, it’s simply a matter of following the steps in the video above, but I’ll summarize them here:

  1. Open the data module that requires relative dates
  2. Click the ‘add source’ button
  3. Navigate to and select the calendar you want. Cognos ships with three by default, Fiscal, Gregorian and 5-4-5. You can create custom calendars as well.
  4. Your data module will now contain a new linked calendar table. This table is easy to recognize as it will be slightly transparent and turquoise.
  5. Now select any date or date/time column in your data and open the properties. Select your calendar table under the ‘lookup reference’ property.
  6. Congratulations, you have configured your relative date filters!
  7. Now select any measure column in your data and open the properties. Select the date you just configured in step 5 under the ‘lookup reference’ property.
  8. Congratulations again, you have configured your relative date measures!
  9. Click on the arrow next to either column to see the filters or measures you created.

Relative date filters

Relative date filters provide drag and drop capability to instantly filter any visualization, table or crosstab to the selected period. By default there are 23 relative date filters, but you can create more by editing the calendar files that ship with Cognos.

A list of all relative date filters in Cognos Analytics
Create all these relative date filters with just a few clicks. Amazing!
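Under the hood, a relative date filter like ‘Prior month’ simply resolves to a date range computed from a reference date and applied to your date column. Here is a minimal Python sketch of that idea; the rows, column layout and function name are invented for illustration, not Cognos internals:

```python
from datetime import date, timedelta

def prior_month_bounds(as_of):
    """First and last day of the month before the as-of date."""
    first_of_current = as_of.replace(day=1)
    last_of_prior = first_of_current - timedelta(days=1)
    return last_of_prior.replace(day=1), last_of_prior

# Hypothetical fact rows: (order date, revenue)
rows = [(date(2021, 6, 3), 100), (date(2021, 7, 9), 250)]

start, end = prior_month_bounds(date(2021, 7, 15))
prior_month = [(d, rev) for d, rev in rows if start <= d <= end]
# Only the June row survives the 'Prior month' filter
```

Cognos generates the equivalent of this bounds logic for you from the calendar table, which is why a single drag-and-drop replaces hand-written date math.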

Relative date measures

Relative date measures are similar to relative date filters but represent the measure value on a visualization rather than a filter. Think of it as the line on a line chart. There are 23 relative date measures by default but by all means, customize.

A list of all relative date measures in Cognos Analytics
Relative date measures galore!

Relative dates in action

If you’re having a difficult time picturing the final product of relative dates, here’s a simple chart to make it easy to understand how useful they are:

Current year vs previous year is a common relative date comparison that you'll build using the Cognos relative date function
Relative dates in action on a line chart

Above you can see a common but powerful analysis. On the X axis we have the months of the year, and on the Y revenue totals. Using relative date measures I can easily plot two lines, one for current year and one for previous year for comparison. I am able to do this with zero coding or SQL. As a bonus, these relative periods will always be up to date – when the year rolls over into 2022, the value of Prior Year automatically updates to 2021.
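Conceptually, a ‘Current year’ / ‘Prior year’ measure pair is the same measure aggregated over two different year buckets relative to the reference date. A rough Python sketch, with invented sample data, shows why the prior-year line updates automatically when the year rolls over:

```python
from collections import defaultdict
from datetime import date

# Hypothetical revenue facts: (date, revenue)
sales = [(date(2020, 1, 10), 80), (date(2020, 2, 5), 90),
         (date(2021, 1, 12), 100), (date(2021, 2, 20), 120)]

# Derived from the as-of date; change this to 2022 and 'prior' becomes 2021
as_of_year = 2021

current, prior = defaultdict(int), defaultdict(int)
for d, revenue in sales:
    if d.year == as_of_year:
        current[d.month] += revenue      # the 'Current year' line
    elif d.year == as_of_year - 1:
        prior[d.month] += revenue        # the 'Prior year' line
# January: current 100 vs prior 80; February: current 120 vs prior 90
```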

As-of-date analysis

By default Cognos Analytics assumes the current date as the point of reference for relative date analysis. Therefore, the final piece of the relative date puzzle is enabling as-of-date analysis. This global parameter allows users to select the ‘as of date’ for the relative time feature. It’s easiest to think of this by asking yourself, relative to what date am I counting ‘year to date’ or ‘month to date’. Is it today? Last Friday? December 19th, 1982?

The as-of-date global parameter allows users to easily select the target date for relative date analysis
The as-of-date selector in Cognos Analytics 11.2
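To make the effect of the as-of date concrete: ‘year to date’ is just a sum bounded by January 1 of the as-of year and the as-of date itself. A small sketch with made-up data (the function and row layout are my own, not a Cognos API):

```python
from datetime import date

def year_to_date(rows, as_of):
    """Sum revenue from Jan 1 of the as-of year through the as-of date."""
    start = date(as_of.year, 1, 1)
    return sum(rev for d, rev in rows if start <= d <= as_of)

rows = [(date(2021, 1, 5), 10), (date(2021, 6, 1), 20), (date(2021, 8, 1), 30)]

year_to_date(rows, date(2021, 7, 15))   # excludes the August row
year_to_date(rows, date(2021, 12, 31))  # includes all three rows
```

Moving the as-of date slider in Cognos is equivalent to changing the `as_of` argument here, which is why every relative filter and measure shifts at once.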

Configuring the as-of-date global parameter is a task for administrators and is very easy:

  1. Open the manage menu and select ‘Customization’
  2. Select the ‘Parameters’ tab
  3. Click ‘New’
  4. Type ‘_as_of_date’
  5. Select the ‘More’ menu (the ellipsis that appears when you hover over the parameter) and click ‘Properties’
  6. Click ‘Applied to all roles’ if you want everyone to have this capability
  7. Log out of Cognos
  8. Log back in to see the global parameters button appear in the upper-right corner of the UI

Cognos relative dates in summary

There you have it. Reading back over this blog post there are a lot of steps included, but don’t get intimidated. This is remarkably easy compared to writing relative date calculations or using SQL to do it. The video above walks through every single step included in this blog post so watch and follow along. And don’t forget to like and subscribe to the Super Data Brothers channel!

IBM documentation references

Relative date documentation home
Sample calendars
Creating a data module for relative date analysis
Creating relative date filters
Customizing the reference date



The 2021 Gartner BI Magic Quadrant is Broken for Cognos Analytics

February 19, 2021 by Ryan Dolley 9 Comments

It’s that time of year again – time for the Gartner BI Magic Quadrant. Like a bad ex who just won’t back off, Gartner drops their annual market warping report around Valentine’s Day to ruin the mood. I’ve expressed strong opinions in the past about Gartner’s report and methodology and this year will be no different. Put simply, they did Cognos Analytics dirty again while rewarding some competitors based on evaluation criteria that have nothing to do with the capabilities or quality of the software. In particular I argue that:

  • Gartner heavily weights the number of inquiries each tool generates for Gartner’s advisory services in determining the rankings. This is a self-reinforcing feedback loop.
  • Gartner is rewarding BI vendors that have attachments to ERPs, office suites and other enterprise software. Particularly SAP and Oracle.
  • Gartner’s write up of IBM doesn’t align with their ranking on the scatterplot, particularly in completeness of vision.
  • Gartner’s Peer Insights customer review site tells a completely different story about Cognos Analytics.
Gartner badly undervalues Cognos Analytics in the 2021 magic quadrant
The Gartner BI Magic Quadrant undervalues Cognos Analytics again

You may be thinking, ‘Ryan of course you feel this way – you’re the most public Cognos fanboy on Earth!’ Guilty as charged. But my issue isn’t with the placement of Microsoft or Tableau in the Leader’s Quadrant. Those tools are crushing it and deserve their placement. What I take issue with is IBM’s location in regard to legacy competitors like SAP or Oracle and the overall parameters that determine the ranking.

How does Gartner evaluate software?

Contrary to popular belief, Gartner does not conduct an in-depth evaluation of what they call ‘ABI platforms’ to identify and compare the performance, features and ease of use of the software. Instead, Gartner typically relies on the number of inquiries each tool generates for their own consultancy, in addition to surveys and interviews, to determine this ranking. The natural outcome is a feedback loop where being a leader last year is the primary criterion for being a leader this year. You simply aren’t going to call Gartner to talk about IBM because Gartner is telling you not to bother.

This year they also seem to be penalizing enterprise tools that don’t come bundled with ERPs or other enterprise software. Unfortunately those tools are typically very poor BI tools – I would know, I’ve worked for them.

Cognos Analytics as a case study

This manifests in a yawning gap between IBM’s scatterplot coordinates and what they actually wrote about Cognos Analytics. Unfortunately the image is what carries all the weight in this report. Comparatively few people will read IBM’s entry, so I’ll summarize it for you below:

IBM Cognos Analytics Strengths

  • One of the only tools that offers comprehensive enterprise reporting and self-service. Gartner calls these ‘mode 1 and mode 2’
  • Strong product vision combining AI augmented analytics, traditional BI and planning capabilities
  • Deployment flexibility on prem and on all cloud platforms

IBM Cognos Analytics Weaknesses

  • People don’t call Gartner asking about Cognos
  • It doesn’t have ‘adoption drivers’ like associations with popular ERPs or office suites
  • It costs about the same as standalone BI tools but more than BI tools that are bundled with ERPs or office suites

Very curious. All of Cognos’ strengths are things BI developers, administrators and users care about. And all of those weaknesses are things that procurement people, industry analysts and – most importantly – Gartner themselves care about. It’s not that I think these are unimportant points, but it seems that software features and quality carry significantly less weight.

Where does Cognos belong?

I’m going to be as objective about this as possible and explain my reasoning, but first take a gander at the world premiere of the Ryan Dolley BI Magic Quadrant!

A more realistic assessment values IBM for a great vision while recognizing execution challenges.
The 2021 Ryan Dolley BI Magic Quadrant is available free of charge!

Cognos Analytics completeness of vision

Gartner really got completeness of vision wrong. This is where Cognos Analytics shines. No other BI platform on Earth combines the breadth of present day capabilities with the visionary roadmap of IBM. Gartner sort of acknowledges this in their write up but it’s not at all reflected in the MQ image so let me break it down for you:

Cognos Analytics today

  • Integrated AI Assistant chatbot for NLQ capabilities and auto-visualization
  • ML-driven one-button forecasting
  • Correlation and causation engine suggests relationships in data
  • Jupyter notebooks for easy BI/ML integration
  • Absolute top enterprise reporting capability in the world
  • Huge scale bursting, schedule and event-driven data distribution
  • Robust extensions allow incorporation of custom visualization libraries and JavaScript code. You can make Cognos do damn near anything. Just ask Paul.
  • Easy to build self-service dashboards and visualizations
  • Easy self-service data modeling
  • Self-service storytelling/narrative BI
  • Unlimited scale
  • Deployable on prem and on any cloud

It can’t be stressed enough that none of the current ‘leaders’ offer such a wide spectrum of features. In fact they offer only bare bones enterprise reporting if they offer it at all.

Cognos Analytics roadmap

The Cognos Analytics roadmap is very strong. There’s a lot I don’t know and a lot I can’t say, but to give you some idea of where things are going:

  • Easy what-if scenario modeling and data science for end users
  • AI-driven data prep and data quality evaluation
  • Deeper integration with Planning Analytics to provide a single portal for enterprise BI and planning
  • Dramatically improved NLQ including ontological customization
  • Continued containerization and modernization of the platform

Cognos Analytics ability to execute

Let’s be honest. There are some real concerns with how existing Cognos Analytics customers are executing with the platform. I know this because I hear your struggles. And some of that comes from stability issues and UI/UX quirks with the software that IBM absolutely needs to address. The good news is that IBM knows this.

IBM's ability to execute score is a reflection of outdated practices, not bad software.
Note to the Cognos community – please start executing!

But the reality is that many long term Cognos customers are simply not able to realize the full capabilities and value of the platform and in many cases I think it’s their own fault. The culture and practice that grew up around Cognos was formed in the late 90s and 2000s and it shows. Features that get extensive and successful use in Power BI and Tableau are left virtually untouched by a majority of Cognos customers. Data modules and data sets have improved massively since their less-than-stellar debuts. Their usage in existing Cognos deployments has yet to catch up with just how darn good they are and it’s a shame. Consider this my plea to you to start delivering with Cognos the same way people deliver with Tableau or Power BI. The platform can support it, the technology is there. What needs to change is our collective ambition as a community of practice.

This is not to say that nobody is executing at a high level. Some people are using all of Cognos’ capabilities to the fullest and having huge success, and new customers typically do great. But until we old-timers collectively start to take advantage of what this platform has to offer, I can’t really fault Gartner’s analysis for ability to execute. It’s off, but only a little.

Cognos Analytics and Gartner Peer Insights

Cognos ranks much higher in Gartner's peer insights platform
Gartner Peer Insights tells a very different story about Cognos Analytics

Gartner has a lesser known product called Peer Insights and it tells a very different story. Peer Insights features product reviews by verified users rather than the opinion of analysts. Take a second to check out the Business Intelligence page and sort by average review. Scroll past the minor players with a single glowing review and look at the placement of the platforms included in the Magic Quadrant. As of this writing Cognos Analytics scores worse than Tableau Desktop. It scores better than Microstrategy, ThoughtSpot, Tableau Server, Domo, Power BI, Qlik Sense, Oracle, Sisense, Looker, Amazon, SAP and Board International.

What to make of all this?

I hope I’ve done a good job of walking you through why I strongly disagree with IBM Cognos Analytics’ placement on the 2021 Gartner Business Intelligence Magic quadrant. I’ve been as objective as I can in this analysis and I don’t think Gartner is wrong across the board. I really do like Power BI and Tableau. I’m increasingly impressed with Domo and very intrigued to see what AWS does going forward. But this report badly misses the mark on Cognos and it does so by disregarding what a strong enterprise BI platform it is in 2021 and penalizing IBM for the lack of market awareness that Gartner themselves have caused.


Want to continue the conversation? Connect with me on LinkedIn and check out the PMsquare YouTube channel.

Data Modeling for Success: BACon 2020

October 15, 2020 by Ryan Dolley Leave a Comment

Getting Started

Logging in and creating a folder for your work

  1. Because of high interest in today’s session we are using a single 96 core 768 GB RAM Cognos server with anonymous access allowed. This makes logistics much easier but means we don’t have access to the ‘My content’ feature of Cognos.
  2. Click here to access Cognos: http://3.216.29.72:9300/
  3. Click ‘Team content’ and navigate to the ‘BACon Users’ folder.
  4. Click the ‘+’ button in the navigation window and select ‘Folder’.
  5. Give your folder a name with the following format: Initials – favorite movie – favorite color. In my case it would be ‘RPD-Solyaris-Green.’ Hopefully this ensures a unique folder for everyone.
  6. We are now ready to begin today’s class.

Organizing for Self Service with Data Sets

In our first example we will prepare a data set for self-service. This data set is sourced from an existing Framework Manager package and is a good example of a technique PMsquare often uses to simplify presentation and improve performance for end users. We will cover the basics of creating a data set, optimizing its performance and using it in a dashboard.

Creating a data set off a relational package

  1. Click ‘Team content’ on the left side of the screen and navigate to Team Content > BACon Modules.
  2. Hover over the ‘Go data warehouse (query)’ package and click the ‘More’ button.
  3. Select ‘Create data set’ from the ‘More’ menu. The data set screen will launch.
  4. The data set screen might look familiar – it is a stripped-down version of Cognos Analytics report authoring. You should see a ‘Source’ section with the ‘Go data warehouse (query)’ package loaded, and an area that reads ‘Add data here’.
  5. Expand the ‘Sales Target (query)’ folder under ‘Insertable Objects’.
  6. Expand the ‘Sales Target (query)’ namespace located within.
  7. From the namespace, drag the following fields to the ‘Add data here’ section of the screen. A list will appear and populate with data from the package:
    • Retailer.Region
    • Retailer.Retailer country
    • Retailer.Retailer name
    • Retailer.Retailer code
    • Employee by region.Employee name
    • Employee by region.Position name
    • Time.year
    • Time.Month key
    • Time.month
    • Time.month (numeric)
    • Sales target fact.Sales target
  8. Click the ‘Page design’ drop down and select ‘Page preview’.
  9. Turn off ‘Summarize detailed values…’ and observe the effect of summarization on the data displayed in the list. The underlying data is at the level of individual target transactions.
  10. Turn detail summarization back on.
  11. Click the ‘Year’ column and select ‘Edit layout sorting’ from the sorting drop down menu in the toolbar.
  12. Drag ‘Year’ and ‘Month (numeric)’ to the ‘Detail Sort List’ and click OK. We have sorted our data set by year and month. Sorting on commonly used columns can improve performance.
  13. Click ‘Position name’ and select ‘Create custom filter’ from the filter drop down menu in the toolbar.
  14. Click ‘Condition’ to access the conditional filter screen.
  15. In the ‘Input condition’ next to ‘Contains’, type ‘Sales’. This will filter the Position name column for values that contain the word ‘Sales’. Removing unnecessary rows or columns can improve performance.
  16. Click the ‘Save’ icon and select ‘Save as’.
  17. In the ‘Save as’ window, navigate to ‘Team Content > BACon users > your folder’. This is the folder you created in step 5 of Getting Started.
  18. In the ‘Save as:’ text box, type ‘Sales Target Data Set’.
  19. Click ‘Save’. The ‘Save as’ window will close.

Finding and loading a data set

  1. Click the ‘Home’ icon to return to the Cognos Analytics welcome screen.
  2. Click ‘Team content’ on the left side of the screen and navigate to ‘Team Content>BACon users>your folder’.
  3. Hover over ‘Sales Target Data Set’ and click the ‘More’ button.
  4. Select ‘Reload’. The Data Set loading notification will appear at the top of the screen. It should load and disappear fairly quickly.
  5. In the ‘More’ menu, select ‘Properties’.
  6. In the ‘Properties’ menu, notice the created, modified and data refreshed dates & times in the upper right.
  7. In the ‘Properties’ menu, expand ‘Advanced’ and scroll down to see statistics about this data set:
    • Size
    • Number of rows
    • Number of columns
    • Time to refresh
    • Refreshed by
  8. Close the properties menu.

Dashboard Creation

  1. Click the ‘More’ button next to ‘Sales Target Data Set’ and select ‘Create dashboard’ in the ‘More’ menu.
  2. Select the fourth available template in the ‘Create dashboard’ screen. It has a large rectangle in the center with four rectangles equally spaced at the top.
  3. Click ‘Okay’. The ‘New dashboard’ screen will load.
  4. On the left side of the screen, click on the ‘Assistant’ icon.
  5. Type ‘Show average sales target for country’ and hit enter. Note that Sales target total is a sum measure, not an average. Cognos is able to compute new aggregate types on the fly using natural language query.
  6. Click and drag the map that appears into the ‘drop here to maximize’ icon in the center of the screen.
  7. Drag ‘Retailer region’ next to the ‘This tab’ icon at the top of the screen to add it to the tab filters.
  8. Click the ‘Retailer region’ filter and select ‘Northern Europe’. The map will zoom to extent.
  9. Add additional visualizations to the dashboard as you see fit.
  10. Save your dashboard as ‘Employee Target Dashboard’.
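Step 5 works because the data set retains detail rows, so Cognos can re-aggregate a sum measure as an average on request. A toy Python sketch of that re-aggregation, with invented country and target values:

```python
from collections import defaultdict

# Hypothetical detail rows: (retailer country, sales target)
rows = [("Germany", 100.0), ("Germany", 300.0), ("France", 200.0)]

totals = defaultdict(float)
counts = defaultdict(int)
for country, target in rows:
    totals[country] += target
    counts[country] += 1

# Average computed on the fly from the detail rows
average_target = {c: totals[c] / counts[c] for c in totals}
# Germany averages 200.0 even though the measure's default aggregate is sum (400.0)
```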

Combining data sets with other sources using data modules

Data sets alone are great for simple requests; for more complex modeling requirements involving multiple tables, complex relationships or relative time, you are better off creating a data module. Data modules can contain multiple data set, database or Excel data sources and are great for IT and power users alike.

Creating a data module

  1. Click the ‘Plus’ button in the lower left side and choose ‘Data Module’
  2. In the ‘Select sources’ menu, select ‘Team Content’. Navigate to ‘your folder’ in ‘BACon users’.
  3. Select ‘Sales Target Data Set’ and click ‘OK’. The data modules screen will open.
  4. Click the table ‘Sales Target Data Set’ in the source view. This will load a preview of the data from your data set into the ‘Grid’ tab. Here you can easily compare columns for join compatibility and instantly see the results of any filters, groups or calculations you build.

Adding additional data sources

  1. Click the ‘+’ icon in the data source view and select ‘Add new sources’
  2. Click ‘Data servers and schemas’ and click ‘GOSALESDW/gosalesdw’. Click ‘OK’.
  3. Click ‘Discover related tables’ and explore the natural language model generation capability a bit. Then click ‘Previous’
  4. Click ‘Select tables’ and click ‘Next’.
  5. Choose ‘Go Time Dim’ from the available sources and click ‘OK’. This will add the Go Time Dim table to our data module.
  6. Repeat the above process to add:
    • Great_outdoor_sales data server
      • GOSALES/gosalesrt schema
        • Retailer table
        • Retailer Site table
        • Retailer Type table
    • Team content>BACon>BACon files
      • Sales data excel.csv
  7. At this point you should have six tables in your data module:
    • Sales Data Excel.csv
    • Retailer
    • Retailer Site
    • Retailer Type
    • Go Time Dim
    • Sales Target Data Set
  8. Save the model as ‘Sales Analysis Data Module’ in ‘your folder’.

Data cleanup and prep

  1. Click the arrow next to the ‘Go Time Dim’ table to expand it. Notice the large number of fields with a ruler icon. Cognos has incorrectly identified these as measures.
  2. Ctrl-click each ruler icon, then click the ‘properties’ icon in the upper-right corner of the screen. This will open the properties window.
  3. Change ‘Usage’ from ‘Measure’ to ‘Attribute’. Change ‘Represents’ to ‘Time’.
  4. Click the ‘Month De’ field, then shift-click the ‘Weekday Tr’ field. This will select all the non-English fields in the table.
  5. Click the ‘more’ button and select ‘Remove’. This removes these fields from the model.
  6. Repeat this process in the ‘Retailer Type’ table to leave only ‘Retailer Type Code’ and ‘Type Name En’.
  7. Save your work.

Simplify table structure

  1. Click the ‘more’ button next to ‘Sales Analysis Data Module’ in the source view. Select ‘Table’ in the menu that appears.
  2. Click ‘Select Tables’ and select the three retailer tables.
  3. Click ‘Create a view of tables’ and click ‘Next’.
  4. Rename the new table ‘Retailer Dim’.
  5. In the ‘Selected Items’ menu, select the following fields:
    • Retailer Code
    • Company Name
    • Rtl City
    • Rtl Prov State
    • Rtl Country Code
    • Type Name En
  6. Click ‘Refresh’ to check your work. Click ‘Finish’ to create your new table.
  7. Examine your handiwork in the ‘Custom tables’ tab. This tab makes understanding the data flow in your model much easier.
  8. Save your work.

Create relationships

  1. Ctrl-click ‘Sales Data Excel.csv’ and ‘Go Time Dim’.
  2. Right-click one of them and choose ‘Relationship’ under the ‘New’ header. The create relationship screen will open.
  3. Ensure the 1 indicator is under ‘Go Time Dim’ and the N indicator is under ‘Sales Data Excel.csv’. You can swap these by clicking the wheel icon in the lower-left corner of the screen.
  4. Select ‘Day key’ from each table and click ‘Match selected columns’. This will create the necessary join criteria.
  5. Click ‘Refresh’ to see a preview of your join.
  6. Click ‘OK’ to build the join.
  7. Repeat this process to create the following joins:
    • Go Time Dim 1-N Sales Target Data Set
    • Sales Target Data Set N-1 Retailer Dim
    • Retailer Dim 1-N Sales Data Excel.csv
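A 1-N join like Go Time Dim to the sales rows means each fact row (the N side) looks up exactly one calendar row (the 1 side) through its ‘Day key’. A miniature Python sketch of that lookup, with invented keys and columns:

```python
# Hypothetical 1 side: one calendar row per day key
time_dim = {20210101: {"year": 2021, "month": 1},
            20210102: {"year": 2021, "month": 1}}

# Hypothetical N side: many sales rows can share a day key
sales = [{"day_key": 20210101, "quantity": 3},
         {"day_key": 20210101, "quantity": 5},
         {"day_key": 20210102, "quantity": 2}]

# Each sales row picks up its single matching calendar row
joined = [{**row, **time_dim[row["day_key"]]} for row in sales]
# The join preserves all three fact rows and adds year/month to each
```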

Refining and enhancing the model

Adding additional business logic can greatly enhance the value of your data module and make it much easier for professional and self-service authors to quickly create valuable content.

Creating calculations

  1. Ctrl-click ‘Quantity’ and ‘Unit sale price’ in the Sales Data Excel table.
  2. Click the ‘more’ button and choose ‘create calculation’.
  3. Name your calculation ‘Sale Total’ and select ‘x’ from the calculation drop down menu.
  4. Note the ‘Calculate after aggregation’ option on this screen.
  5. Click ‘OK’
  6. Repeat the steps above to create the following calculations:
    • Cost total: Quantity x Unit cost
    • Sale margin: Sale total – Cost total
    • Plan variance: Sale total % Planned revenue
  7. Click the ‘more’ button next to the Sales Data Excel table and select ‘calculation’. This will open the advanced editor.
  8. Name your calculation ‘Sales Margin’.
  9. Copy-paste the following logic:
    (Revenue – Cost_Total) / Revenue
  10. Click the ‘more’ button next to ‘Sales margin’ and select ‘Format data’.
  11. Select ‘Percent’ in the drop down. Click ‘OK’.
  12. Ctrl-click ‘Cost total’ and ‘Sale total’. Click the ‘more’ button and select ‘format’
  13. Select ‘Currency’ in ‘Format type’.
  14. Select ‘USD’ in the ‘Currency’ drop down.
  15. Click ‘OK’
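The calculations above reduce to simple row-level arithmetic. Here is a sketch with made-up values; note that I am reading the ‘%’ in ‘Plan variance’ as ‘percentage of plan’, which is my assumption:

```python
# Hypothetical single fact row
row = {"Quantity": 10, "Unit sale price": 25.0,
       "Unit cost": 15.0, "Planned revenue": 200.0}

sale_total = row["Quantity"] * row["Unit sale price"]   # 250.0
cost_total = row["Quantity"] * row["Unit cost"]         # 150.0
sale_margin = sale_total - cost_total                   # 100.0
plan_variance = sale_total / row["Planned revenue"]     # 1.25, i.e. 125% of plan
margin_pct = (sale_total - cost_total) / sale_total     # 0.4, shown as 40% with Percent format
```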

Adding drill paths

  1. Expand the ‘Go time dim’ table and click the ‘more’ button next to ‘Current Year’.
  2. Select ‘Create navigation path’.
  3. Name the navigation path ‘Time’.
  4. Drag ‘Current Quarter’, ‘Current Month’ and ‘Day Date’ into the navigation path. Click ‘OK’.
  5. In the source view, click the ‘Identify navigation path members’ button.
  6. Repeat the process to create the Retailer location path:
    • Rtl Country Code
    • Rtl Prov State
    • Rtl City
  7. You can create any hierarchy you want – feel free to make something crazy!

Adding relative time

  1. Click the ‘+’ icon and select ‘Add new sources’.
  2. Navigate to ‘Team content>Calendars’ and select ‘Fiscal calendar’. Click OK. The Fiscal Calendar table should be visible in your module.
  3. Expand ‘Go Time Dim’ table and click ‘Day date’. Open the column properties.
  4. Under ‘Lookup reference’ select ‘Fiscal Calendar’
  5. Expand ‘Day Date’ and note the large number of relative time filters.
  6. Expand the ‘Sales Data Excel.csv’ table and shift-click the calculations we made above.
  7. Click the ‘more’ button and open the properties
  8. Set the lookup reference to Go Time Dim.Day Date.
  9. Expand any of the calculations and note the large number of relative time calculations.

Cleaning up for consumption

  1. Open the properties for ‘Sales Data Excel.csv’ and change the name to ‘Sales’.
  2. Change ‘Sales target data set’ to ‘Sales target’.
  3. Click the ‘more’ button next to ‘Sales Analysis Data Module’ and select ‘Folder’ in the ‘New’ section.
  4. Name the folder ‘Retailer tables’
  5. Click and drag the ‘Retailer’, ‘Retailer Site’ and ‘Retailer Type’ tables into the ‘Retailer tables’ folder. This will nest the tables within the folder.
  6. Click the ‘more’ button next to the ‘Retailer tables’ folder and select ‘Hide from users’.

Managing modules

Securing models

  1. Click the ‘Source view’ button.
  2. Expand ‘gosalesrt’ data server.
  3. Click the ‘more’ button next to ‘Retailer’ and select ‘set data security’.
  4. Click ‘Add security definition’ in the properties window. This will open the security window.
  5. Name your security filter ‘Analytics users’
  6. Expand the ‘Cognos’ namespace and select ‘Analytics users’. Click ‘OK’.
  7. Select ‘Company Name’ from the Filters drop down and click ‘Add a filter’. The filter window will open.
  8. Select ‘4 golf only’ and click ‘OK’.
  9. This sets data-level security at the data server level, not at the level of your individual model! Changes made here will affect ALL models built on this data server, for better or worse.
  10. Click ‘Cancel’ to leave the security window.

Model inheritance

  1. Click ‘New’ and select ‘Data module’
  2. Navigate to the ‘Sales Analysis Data Module’ located in ‘your folder’. Click ‘OK’
  3. Notice how each table is now turquoise with a ‘link’ icon on it. These are linked tables and will inherit changes made to their parent module.
  4. Check the ‘more’ menu on the ‘Sales’ table. Note that many options are missing. The same is true for individual data items.
  5. Note the ‘Break link’ option in the ‘more’ menu. This will disconnect the linked table from its parent and it will no longer inherit changes. The ability to click this button can be turned off in security.

Conclusion

In this workshop we combined a data set sourced from a Framework Manager package, tables from a Microsoft SQL Server database and an Excel spreadsheet into a single data module, which we made available to self-service and professional authors.

I hope this workshop has unlocked a deeper understanding of how the Cognos Analytics data prep features can drastically reduce the time it takes to acquire, model and visualize data for both self-service users and IT professionals. As always, please contact me at rdolley@pmsquare.com with any questions!

The 3 Cognos Query Modes

January 29, 2020 by Ryan Dolley 12 Comments

This is actually a sequel to one of the very first blog posts I ever wrote for Blueview. While a lot has changed since 2015, understanding the difference between Compatible Query Mode and Dynamic Query Mode is still crucial. With the addition of data sets running on the compute service (aka ‘Flint’), things are even more complicated. My goal for this article is to run back the clock, peel back the onion and give you a historical, technical and practical understanding of the 3 Cognos query modes. Buckle up folks, this is going to get wild.

All 3 Cognos query modes are available in Cognos 11.1
There are many query engines hidden beneath the hood of your 2020 model Cognos

Compatible Query Mode

Compatible Query Mode is the query mode introduced in ReportNet (AKA, my junior year of college…). It is a 32-bit C++ query engine that runs on the Cognos application server as part of the BIBusTKServerMain process. CQM was the default query mode for new models created in Framework Manager up to Cognos 10.2.x, after which Dynamic Query Mode became the default. The majority of FM models I encounter were built in CQM and thus the majority of queries processed by Cognos are CQM. It remains a workhorse.

CQM relies exclusively on the report service to deliver query results
CQM resides within the Report Service

It is, however, an aging workhorse. Query speed is hampered by the limitations of 32-bit processes, particularly as it relates to RAM utilization. CQM does have a query cache but it runs on a per session, per user basis and in my experience causes more problems than it’s worth. Furthermore, Cognos 11 features either don’t work with CQM (data modules) or must simulate DQM when using CQM-based models (dashboards). This almost always works but of course fails whenever you need it most…

CQM works just fine and moving to DQM is not urgent; however, I strongly advise you to do all new Framework Manager modeling in DQM (or even better, build data modules) and start seriously considering what a migration might look like.

Dynamic Query Mode and the Query Service

Dynamic Query Mode is the query mode introduced in Cognos 10.1. It is a 64-bit Java query engine that runs as one or many java.exe processes on the Cognos application server and is managed by the query service. The terms ‘DQM’, ‘query service’ and ‘XQE’ all essentially refer to this Java process. All native Cognos Analytics features utilize DQM only – CQM queries execute in simulated DQM as mentioned above. You can see the criteria necessary for this to work here. DQM is both very powerful and very controversial among long-time Cognoids. Let’s take a look at why.

DQM uses the query service to deliver results
DQM features dramatically improved query performance

What’s great about DQM?

DQM has a ton going for it. As a 64-bit process it can handle vastly greater amounts of data before dumping to disk. If configured and modeled properly, it features a shared in-memory data and member cache that dramatically improves interactive query performance for all users on the Cognos platform. It even filters cached query results by applying your security rules at run time.
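That run-time security filtering is worth pausing on: because the rules are applied when results are read, a single shared cache can serve many users with different permissions. A minimal Python sketch of the idea (all names and the region-based rule are hypothetical examples; DQM implements this internally):

```python
# Illustrative sketch of security-filtered reads from a shared result cache.
# The row schema and the 'regions' rule are invented for this example.

def read_from_shared_cache(cached_rows, user_rules):
    """Return only the cached rows the current user may see, applying
    row-level security at read time instead of caching one result set
    per user."""
    allowed = user_rules["regions"]
    return [row for row in cached_rows if row["region"] in allowed]

# One cache, two users with different security rules:
cache = [
    {"region": "EMEA", "revenue": 120},
    {"region": "APAC", "revenue": 90},
]
emea_rows = read_from_shared_cache(cache, {"regions": {"EMEA"}})
global_rows = read_from_shared_cache(cache, {"regions": {"EMEA", "APAC"}})
```

The point of the design is memory efficiency: the expensive query result is cached once, and only the cheap filtering step is repeated per user.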

DQM is tuned via Cognos administration and by a large number of governors in Framework Manager to optimize join execution, aggregation and sorting. It handles extremely large data volumes, especially when combined with the basically defunct Dynamic Cubes feature. It even combines cached results with live SQL executed against a database on the fly. On its own. Like you don’t have to tell it to do that, it just does. Magic!

What’s not great about DQM?

Unfortunately, despite the list of excellent attributes above, DQM has some problems. It is very complex to understand, manage and tune, and requires DMR models to fully utilize all the caching features – consider that the DQM Redbook produced by IBM is 106 pages. A standalone tool called Query Analyzer exists, dedicated to helping you understand what the heck DQM is even doing as it plans and executes queries.

Migrating from CQM to DQM is often a complex project to evaluate and execute. I once provided a customer an LOE estimate of 8 – 32 weeks to complete a migration project. I have seen migrations take almost a year. I’ve seen things you people wouldn’t believe…

The purpose of this blog is not to push professional services but this is one instance where I think you really should contact PMsquare for help. But let’s say you have a ton of CQM models and don’t have the time to migrate them all. Is there a shortcut to high performance on large(ish) data volumes? Why yes, yes there is.

Data Sets and the Compute Service (aka ‘Flint’)

Data sets are an in-memory data processing engine first introduced in Cognos 11.0 and greatly enhanced in 11.1. Cognos 11.1 data sets run on the compute service, aka ‘Flint’. The compute service is a 64-bit Spark SQL process that is created and managed by the same query service that manages DQM, so it’s not really an independent Cognos query mode. I will write a more in-depth article about data sets and Flint in the future, but let’s take a super quick look at how they work before we get into why they are amazing.

The compute service uses Spark SQL to deliver results
The compute service is a modern in-memory compute engine

How do data sets and the compute service work?

Data sets are not live connections to the underlying data like CQM or DQM – rather, they are a data extraction that is stored in a parquet file and loaded into the Cognos application server memory when needed for query processing. It works like this:

  • An end user creates a data set from an existing package, cube or data module OR uploads an Excel file (the process is the same!)
  • Cognos fetches the necessary data and loads it into an Apache parquet file
  • The parquet file persists in the content store and is available to all application servers
  • When the query service on an application server requires a data set for query processing, it first checks to see if it has a local and up-to-date copy of the parquet file
  • If not, it fetches one
  • In either case, the parquet file is then loaded into the memory of the application server
  • Data is processed by the compute service using Spark SQL and results are returned to the query service
  • The query service receives results from the compute service and may perform additional processing if necessary
  • The results are then passed to the report service or batch report service for presentation
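The freshness check in the steps above can be sketched in a few lines of Python (the function and paths are hypothetical; Cognos manages this internally with its parquet files):

```python
import os
import shutil

def ensure_local_copy(content_store_path, local_cache_dir):
    """Copy a data set file from the 'content store' to an application
    server's local cache only when the local copy is missing or older
    than the source, mirroring the up-to-date check described above."""
    local_path = os.path.join(
        local_cache_dir, os.path.basename(content_store_path)
    )
    stale = (
        not os.path.exists(local_path)
        or os.path.getmtime(local_path) < os.path.getmtime(content_store_path)
    )
    if stale:
        # Fetch a fresh copy; copy2 preserves the source's timestamp,
        # so the next check sees the cache as up to date.
        shutil.copy2(content_store_path, local_path)
    return local_path
```

The design choice to illustrate is that the fetch is lazy: nothing moves over the network until a query actually needs the data set, and an up-to-date local copy short-circuits the whole step.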

What makes data sets great?

They’re easy to build, easy to join and manipulate in data modules, easy to schedule and the performance is great. Once loaded into memory a data set is shared between users on the same application server. I have done multiple projects where I accomplish weeks or even months of ETL by getting fancy with data sets and data modules. No wonder they are my favorite of the Cognos query modes.

What’s even better is how data sets provide a radically shorter path to high-performance, DQM- and Spark-based queries for your existing CQM models without having to commit to a full conversion. You simply use a CQM FM package as the basis for a data set, then utilize that data set as a source in a data module. Once complete, you’ve unlocked the full set of incredible data module and dashboard capabilities like forecasting without having to do an 8 to 32 week project.

Which Cognos Query Mode is right for me?

Okay, that was a ton of information, some of it pretty technical. Which of the Cognos query modes should you choose, and how do you learn more?

TLDR

  • Immediately cease all development of new Framework Manager models using CQM
  • Consider migrating existing CQM Framework Manager models to DQM models or to data modules (PMsquare can help with this)
  • Data sets are your ‘get out of CQM free’ card; they vastly improve the performance of most CQM queries and simplify presentation for end users

References

  • Dynamic Query Mode Redbook
  • Cognos DQM vs CQM explainer
  • Queries on uploaded files and data sets
  • Configuring the Compute service

Read on to up your Cognos game

  • Cognos Union Queries in Reports
  • Cognos Relative Dates in 11.2
  • The 2021 Gartner BI Magic Quadrant is Broken for Cognos Analytics
  • Data Modeling for Success: BACon 2020
  • Cognos Analytics 11.1.6 What’s New

What being an IBM Champion means to me

January 16, 2020 by Ryan Dolley Leave a Comment

I received some great news on Tuesday from IBM – I have been selected as an IBM Champion again in 2020! And not only that, but my colleagues Cognos Paul Mendelson, Sonya Fournier and Mike DeGeus were selected as well. This brings our total IBM Champion count at PMsquare to 4! I’m extremely proud of our excellent team. We hire only the absolute best Cognos resources and IBM recognizes it. This post isn’t just for back slapping though; let’s discuss what being an IBM Champion means to me.

What is an IBM Champion?

The IBM Champion Logo

Directly from the IBM website:

IBM Champions demonstrate both expertise in and extraordinary support and advocacy for IBM technology, communities, and solutions.

The IBM Champion program recognizes these innovative thought leaders in the technical community and rewards these contributions by amplifying their voice and increasing their sphere of influence. IBM Champions are enthusiasts and advocates: IT professionals, business leaders, developers, executives, educators, and influencers who support and mentor others to help them get the most out of IBM software, solutions, and services.

The program runs on a yearly cycle and members must re-apply each year.

What does it mean to me?

Being an IBM Champion means more than just a pat on the back and some SWAG. First and foremost it is a recognition of the work I do to feed the Cognos community with this blog, by moderating the Cognos subreddit and contributing to the IBM Analytics community site. Building our community is incredibly important to me and I am so very grateful for the feedback I get from you and from IBM. It keeps me going.

The program is also about being part of a huge community of IBM advocates. The Slack is extremely active, extremely helpful and features the top experts on all IBM technologies in the world. The camaraderie in the program is quite amazing and I’ve made friends from all over the world. Meeting fellow champs is the most rewarding part of the program.

Finally, it’s about promoting your professional passion and working with IBM to improve their offerings. Champions receive next level access to the product teams for the respective technologies. If you’ve ever wondered how I know so much about the direction of the product this is a big part of it.

And SWAG, don’t forget the SWAG…


Join me at Data and AI Forum 2019

October 11, 2019 by Ryan Dolley Leave a Comment

I’ll be attending Data and AI Forum again this year, as I do every year. Data and AI Forum is THE conference to attend for Cognos, Planning Analytics and other IBM analytics technologies. THINK is the big IBM conference, but for us, this is the valuable one.

If you’re interested in saying hello you can find me at the PMsquare booth or attend one of my sessions:

If you’re struggling to modernize your Cognos environment, the second session is a treat.

