IBM Blueview

Cognos Analytics and all things IBM


Cognos Analytics Forecasting Overview

November 7, 2019 by Ryan Dolley 7 Comments

Cognos Analytics 11.1.4 brings exciting and long-sought forecasting capabilities to the platform. While Cognos is far from the first analytics tool to offer forecasting, IBM's implementation is more flexible, more powerful and easier to use than the competition's. Let's take a dive into Cognos Analytics forecasting to understand why.

Cognos Analytics forecasting is a powerful and easy to use forecasting tool
Cognos Analytics forecasting is just really cool.

Where is forecasting available?

Forecasting is available in line, column and bar visualizations in Dashboards, Stories and Explorations as of 11.1.4. Report Authoring is on the horizon (in fact, there's a ton of cool stuff coming for Report Authoring). The ability to forecast column and bar charts sets Cognos apart from Tableau and Power BI. Cognos hides the forecasting button until a visualization meets the following criteria:

  • Visualization is the proper type (line, column or bar)
  • Visualization has a measure in the Y axis
  • Visualization has a time element in the X axis

Once these criteria are satisfied, the forecast button appears next to the insights button in the top right of the visualization.

You start from any line, column or bar chart
Forecasting is available for line, column and bar charts

In the above example I use the ‘Great outdoors data module’ available in the samples. Using the Sales table I apply Revenue to the Y axis and nest Year and Month on the X axis – the perfect scenario for forecasting.

Single category forecasts

One click of the forecast button is all it takes to build a predictive forecast
Forecasting a single category is simple

Click the forecast button, turn the feature on and there you have it – your very own Cognos Analytics forecast! You may immediately notice that the slope of the line changed dramatically once a forecast was applied. I first assumed this was a bug until I clicked the yellow ‘!’ next to the forecast button and read these magical words:

Cognos will even automatically re-order your time categories
Forecasting automatically orders time categories chronologically!

That’s right – Cognos automatically reordered the time categories to be in chronological order rather than alphanumeric! I’ve been waiting for this feature for all my life and it’s finally here. Now we just need IBM to automatically apply it to all visualizations, not just forecasts.

Multi-category forecasts

Adding additional categories builds a multi line or bar forecast
Forecasting multiple categories, just as easy

In the above example I add 'Retailer Type' from the Retailers table to the color attribute of the visualization. Cognos instantly re-applies the forecast to each individual category. This is one advantage Cognos has over Power BI, which currently forecasts a single category only. This makes data discovery on forecasts extremely fast and easy.

Interacting with forecasts

Hovering over an individual point in the forecast will show you the confidence upper bound, forecast and lower bound for that intersection of the chart:

Hover over tool tips provide additional context for forecast data points
Forecast popups show the confidence intervals calculated by Cognos

In this example I hover over the projection for 'Outdoors Shop' in September 2018. You can see Cognos' predictions for the upper and lower bound.

Focusing on a single category will visualize the confidence interval
You can visually see the upper and lower bound as well

Clicking on the line for ‘Outdoors Shop’ will filter the dashboard by this category, bring the selected category into visual focus and plot the confidence intervals as they evolve in the forecast. Super duper cool.

Configuring forecasts

Forecast configuration options are available in the menu that appears when the forecast button is clicked.

Cognos analytics forecasts have several configuration options to customize how the forecast is generated
Forecast configuration options help tailor results

Let’s take a look at what these options do:

  • Forecast periods: The number of periods included in the forecast. The default 'Auto' projects forward by 20% of your history; if you have 10 months of data, Auto will generate an additional 2 months of forecast.
  • Ignore last periods: Useful for cases where you have incomplete data at the end of your chart – for example, the month of November currently has 6 days' worth of actuals. Ignoring this incomplete period creates a more accurate forecast.
  • Confidence level: Controls the confidence level displayed on the tool tip and confidence visualization – options are 90%, 95% and 99%.
  • Seasonality: Cognos automatically detects seasonal fluctuations in your data and accounts for them in its forecasts – think about retailer revenue during the holidays, for example. The default 'Auto' setting will build multiple models with different seasonal periods and select the best one, but you can also specify a seasonal period by entering a number here.

That last option is incredibly powerful and another strong advantage for Cognos over Power BI and Tableau. Forecasting that cannot detect and account for seasonal variation is of little use in many industries – retail, hospitality, utilities – and only Cognos handles it automatically.
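
To make that 'Auto' seasonality idea concrete, here is a tiny Python sketch of one way a tool could pick a seasonal period automatically: score a few candidate season lengths and keep the one with the lowest error. This is purely illustrative – IBM hasn't published the selection logic Cognos actually uses – and the candidate periods and scoring rule below are my own assumptions.

def best_seasonal_period(series, candidates=(3, 4, 6, 12)):
    """Pick the candidate season length with the lowest seasonal-naive error."""
    def seasonal_naive_error(period):
        # Average absolute error when each point is 'forecast' by the value one season earlier.
        errors = [abs(series[i] - series[i - period]) for i in range(period, len(series))]
        return sum(errors) / len(errors)
    usable = [p for p in candidates if p < len(series)]
    return min(usable, key=seasonal_naive_error)

# Three years of made-up monthly data with a December spike: 12 should win.
history = [100, 98, 105, 110, 108, 112, 115, 117, 120, 125, 130, 180] * 3
print(best_seasonal_period(history))   # -> 12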

Understanding forecasts

Cognos uses exponential smoothing models to generate these forecasts. I don't know what that means either, but you can read about it here. The important thing to know is that you have access to the forecasting statistical details in the data tray.

Cognos analytics forecast statistical details are available for each category in the visualization
For those that know, here are the statistical details for the forecast

I’m a BI/DW guy and this screen might as well be magical incantations to me, but the fact that Cognos provides this level of detail means that I can always find a friendly wizard to explain it. IBM has a nice explanation that I also don’t understand here.
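
If you want a feel for what exponential smoothing actually does, here is a minimal, hand-rolled Python sketch of Holt's linear (trend-corrected) exponential smoothing with a simple residual-based confidence band. It is an illustration, not IBM's implementation – the sample data, the smoothing constants and the 20% 'Auto' horizon are all assumptions for the example.

import math

def holt_forecast(series, alpha=0.4, beta=0.2, horizon=None, z=1.96):
    """Smooth `series` with Holt's linear method and project forward."""
    if horizon is None:
        # Mimic the 'Auto' forecast periods option: roughly 20% of the history.
        horizon = max(1, round(0.2 * len(series)))
    level, trend = series[0], series[1] - series[0]
    residuals = []
    for actual in series[1:]:
        one_step = level + trend                       # one-step-ahead forecast
        residuals.append(actual - one_step)
        prev_level = level
        level = alpha * actual + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    sigma = math.sqrt(sum(r * r for r in residuals) / len(residuals))
    forecasts = []
    for h in range(1, horizon + 1):
        point = level + h * trend
        margin = z * sigma * math.sqrt(h)              # band widens as the horizon grows
        forecasts.append((point - margin, point, point + margin))
    return forecasts

# Ten months of made-up revenue; z=1.96 roughly corresponds to the 95% confidence option.
monthly_revenue = [112, 118, 121, 119, 130, 134, 141, 138, 150, 155]
for lower, point, upper in holt_forecast(monthly_revenue):
    print(f"lower {lower:7.1f}   forecast {point:7.1f}   upper {upper:7.1f}")

Cognos' real models are more sophisticated (seasonal terms, automatic model selection and so on), but the shape of the output – a point forecast bracketed by an upper and lower bound – is exactly what you see in the tool tips above.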

Bar chart forecasts

Switching to a column/bar chart retains the forecast and forecast parameters that were applied to your line chart.

Bar chart forecasts work just as well.

One thing it doesn’t do, which I just discovered in writing this blog post, is automatically sort the time category in chronological order like it does with line charts. I’m not sure if this is purposeful, an oversight or a bug. You can vote for my feature request to rectify this here.

Cognos Analytics forecasting is the real deal

I have to say, I think this feature is extremely well done and I'm certain end users are going to love it. If you need help prepping your environment for self-service, see my guide here. Allowing your users to build their own data modules will make forecasting even more powerful – read more about it here.

IBM was late to the market with it but delivered something much better than their competitors. This has been a recurring story in the last year – Cognos has come so far since the 11.0 releases. It’s up to IBM – and to us – to make sure the broader BI community is aware of the good things going on in Cognos Analytics.

Cognos Analytics Linked Modules

October 29, 2019 by Ryan Dolley 4 Comments

If you’ve been following this blog you know that I love Cognos data modules to death. They are easy to learn, easy to use and full of functionality. Today we’re going to explore using Cognos Analytics Linked Modules to maintain enterprise data governance while enabling self-service. This entry is meant to give a straightforward implementation guide to reinforce concepts outlined in the posts Data Modules as a Single Source of Truth and How to Organize Cognos for Self-Service.

Companies do not have to choose between governance and self-service
Like Jedi vs Sith… can you truly have both?

How and why of linked modules

Using linked modules is a slam dunk way to enable self-service analytics on a foundation of trusted and governed data. When you link two modules you are creating a relationship in which:

  • The IT expert can build and maintain a complex module with high quality data that requires significant skill to create
  • The power user can utilize high quality data as a foundation for analysis but customize and extend at their own pace, not IT’s
  • The casual user can build dashboards and run reports that meet their needs without waiting for IT

This works so well because the power user's modules automatically inherit all the complex relationships, transformations and logic from the enterprise module prepared by IT, in near real time, without the power user having to do anything. It's magic.

Step 1: Build an enterprise data module

The enterprise module is typically – but not always – built by IT, more often than not as the result of a much longer process of data warehouse design and ETL. Regardless of how it's created or what it contains, the enterprise module will serve as our example, though any two modules can be linked together.

Any data module can serve as a source for another data module
The enterprise module serves as the source / parent when linking data modules

Our example uses 'Great outdoors data module' from the Cognos samples. This module is located in the data folder. Once built, the enterprise module should reside in a folder where self-service users can easily access it. I advocate creating a data library folder for this purpose.

Step 2: Use a data module as a source

IT has built and deployed an enterprise module with validated and accurate data, which takes a long time. Now a power user needs to extend that module with some additional context and can't wait for IT to do it. The solution is simple – and no, it isn't Tableau: build a new data module using the enterprise module as a source!

Adding a data module source is no different from adding any other source
Adding a data module source is extremely easy, fun and profitable

When creating a new data module, users can select other modules as sources in the ‘select sources’ screen. In our example, select ‘Great outdoors data module’ and hit ‘OK’. That’s it. We’ve built a linked module.

Data module sources appear in teal and have a link icon
A linked module has visual cues to let you know that it is indeed a link

The tables above appear in teal rather than blue and have a link icon, indicating that they are actually inherited from another module. At this point any changes made to the tables of ‘Great outdoors data module’ will automatically flow through to this module with absolutely no intervention.

Step 3: Extend the enterprise module

You'll notice that the teal linked tables do not offer much in the way of customization, by design. They inherit everything from the parent module. This means users cannot change what comes from the enterprise module, but they can extend it with additional logic and data.

Linked tables don't have all the options of normal tables as you're trying to preserve governance.
No options = no opportunity to mess with governed data

Cognos restricts us to creating data groups and navigation paths and editing a handful of properties (shown above for 'Product line') – a far cry from what is available in the source module. However, we still have considerable power to extend the enterprise module. Let's look at some examples.

Creating calculations

Linked module calcs work as expected

In the example above, the calculated field 'Revenue Variance Amount' is built from the fields 'Revenue' and 'Planned Revenue' in the Sales table. This custom calculation is built entirely using elements inherited from our enterprise module. At many of my customers such a simple request spends weeks or months in the IT backlog, but here we accomplish it in minutes while still using centrally modeled and governed data. Yowza!

Adding additional data

Adding additional data is easy

Here I join a Cognos data set called ‘Sales Staff Analysis’ to the tables from the enterprise module via two conjoined dimensions. Joining a spreadsheet is just as easy, which is what the vast majority of power users want to do.

These are just two examples of extending an enterprise module using the linked module functionality. Users can do almost everything data modules have to offer while still relying on trusted data, including building table views and aliases, creating drill paths, custom groupings and filters, row-level security and more.

Step 4: Watch the linking magic in action

Before the module can be used it must be saved. Users typically save modules to 'My content', but it is critically important that you allow them to save self-service modules to a location where they can easily share them with coworkers. Now let's close this module and test our module link.

Almost all changes made to the source module flow through

Here I create a new calculation in our enterprise module called ‘Revenue Percent Variance’ and click ‘Save’. I then immediately create a dashboard using the self-service module as a data source. Keep in mind that I made absolutely no updates to the self-service module before doing so.

The changes flow through just like magic!

As you can see, ‘Revenue Percent Variance’ is immediately available for use in Dashboards even when they reference the self-service module rather than the enterprise module.

Putting it all together

Allow me to summarize what we just did:

  1. An enterprise module is made available as a data source
  2. Power users extend the enterprise module with their own logic and data in self-service modules
  3. Content is built using both the enterprise and self-service modules as data sources
  4. Updates to the enterprise module flow through to all content, regardless of which data source was used

Single source of truth meets self-service

Nobody but Cognos offers this combination of data governance and self-service. It's a game changer for organizations that truly embrace it. Book some time with me to explore these ideas further (there's a link on the screen) and be sure to check out my live impressions of the recent 11.1.4 release.

How to Organize Cognos for self-service

October 28, 2019 by Ryan Dolley 4 Comments

Most Cognos environments are organized to support a world of IT-authored report consumption – a fact that I confirm over and over as I present on Cognos modernization across the United States and am asked by beleaguered but hopeful-looking Cognos admins, 'How am I supposed to support all this self-service stuff?' Never fear – here's my take on how to organize Cognos.

Why your environment needs a re-org

11.1.4 is out, it's excellent and you need to utilize all of its features in production ASAP. Reticent clients sometimes object that their existing folder structure and security cannot support all the new features. Honestly, they probably can't. The typical IT-managed legacy Cognos environment has some combination of the following:

  • High level folders that reflect the org structure (finance, HR, etc…) in which users can do nothing
  • Low level subject area folders nested within the org folders in which users can do nothing
  • A hidden data sources folder that users can’t find
  • A self-service folder where old timers use Query Studio
  • Non-prod environments where all real work must be done
  • A deployment cycle that takes weeks to bring finished reports to production

No path leads to modernization with these roadblocks to user adoption in place. Feel no shame, however, if this describes your environment in part or whole – this was absolutely best practice when most Cognos environments were first established.

Towards a new folder structure

Modern Cognos environments deliver consumer oriented experiences built around use cases rather than org structures or IT priorities. Redesigning folder structures and security to achieve this feels daunting – but it’s crucial. Here’s an example of what it looks like:

A modern Cognos Team Content folder structure

Curated Content

Curated content is the closest thing to a traditional IT-centric approach and is where a large portion of existing IT authored reports wind up. Mission critical reports live here alongside anything too high profile to mess up. Tight security and IT control are the name of the game here.

Organize cognos reports that are critical into the curated content location
Curated Content is where IT rules the roost

Nobody saves to curated content without IT approval and strict documentation. If your requirements involve audit, regulatory reporting or the CEO's eyeballs, it goes here.

Data Library

The data library is a semi-curated area and is THE PLACE to go to find data in your company. The data must be high quality and IT should exercise significant oversight; however, data stewards and trusted power users have the ability to promote data to this location.

Organize cognos data sources into a single data library for easy access

Do not structure the data library by data source or modeling tool – that's thinking like IT! It's okay to have folders mingled with packages and data modules. Using linked modules to reference the data library maintains a single source of truth.

Sensitive data targeted at only a specific department – HR data, for example – can be located here as well, provided you have appropriate object- or data-level security in place, but it may be best kept in department content.

Department Content

Department content is a semi-curated area controlled by the data stewards and power users in each of your business units. Much of your existing content migrates here and becomes the responsibility of the business unit to secure, maintain and update. This means granting admin, modeling and report authoring powers to trusted individuals in the business.

Organize cognos content by department
Data stewards and power users maintain department content – not IT!

Yes, this means devolving responsibility for maintaining department content folder security to the business. This is a shocking recommendation, I know, but this power belongs with those who know the data and user community best.

Help

Help contains everything to assist your users with Cognos. The Cognos samples go here. Example content you build goes here. Everyone has access to this location.

EDIT: As Jeremy points out in the comments, moving the ‘Templates’ folder from its original deployment location breaks the templates feature. Please leave it in its original location at the root of the ‘Team Content’ folder.

organize cognos help in one place to make it easy to find
Help – I need some Cognos! Help – not just any Cognos!

Self-Service

Self-service is hands off to IT. Anybody with access to Cognos can fully utilize all of their capabilities in this location without asking permission first. That means making folders, building data modules, uploading Excel files, etc. I mean it – everything!

Self-service sometimes feels like an M.C. Escher painting, but you must embrace the chaos!
A real life depiction of the self-service location in Cognos

You are indeed inviting chaos into Cognos, but this chaos already exists in your organization – it goes by the names ‘Tableau’ and ‘Power BI’. Giving full access to all Cognos features in the self-service folder encourages users to, well, use Cognos.

This is not to say you surrender all control – rather, you closely monitor what users are doing using the audit reports or a product like PMsquare’s Thrive and intervene as necessary under the following conditions:

  • Users are sharing inappropriate data
  • Users need help utilizing Cognos
  • Self-service content gains too wide an audience

PMsquare Thrive will help
Thrive will tell you what they are doing…

The beauty of the self-service folder is this: when a user builds a dashboard based on spreadsheets that winds up being used by 200 people, you'll know. And you and the relevant data steward from the business can transition that dashboard to department content or curated content, where it belongs.

Now you’re ready to organize Cognos

I hope this gives you some inspiration to deploy a more modern approach to your Cognos environment. There is much more to this than simply changing the folder structure, but a new folder structure is a crucial piece of the puzzle and one that I get asked about all the time. You’ll also need to consider things like persona-driven development, data governance with linked modules and thinking like an app instead of a report. I’ll cover all these topics and more going forward.

Cognos 11.1.4 Performance Tips

October 17, 2019 by Ryan Dolley Leave a Comment

Are you attending Data and AI Forum in Miami next week? Want to talk some Cognos with me? Snag a half hour of my time to chat free from sales guys.

Thank you to those who attended Tuesday’s 11.1.4 livestream! Technical issues aside I had a great time doing it and will be hosting more in the future. There were some comments about performance in recent releases that I promised to address but didn’t – so here are my Cognos 11.1.4 performance tips!

Tip one: Dispatcher > Gateway

Cognos 11 brought with it the option to dispense with the gateway entirely and send your traffic directly to the dispatcher. This is, among other things, a performance hack that can pay big dividends, particularly if your Cognos environment takes an extremely long time to load or if the UI feels generally sluggish.
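
Concretely, the switch is just a matter of which URL your users hit. The examples below use typical Cognos Analytics 11 defaults – treat them as an illustration, since your gateway alias and dispatcher port may be configured differently:

  • Through the gateway (web server in front of Cognos): http://webserver/ibmcognos/bi/
  • Direct to the dispatcher (default dispatcher port 9300): http://cognosappserver:9300/bi/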

The downside here is that for most enterprise deployments relying on a single dispatcher is just not an option for a ton of reasons – including little things like SSO, load balancing and disaster recovery. But if you are able to try this you might find it helps quite a bit.

Tip Two: Reality > Minimum Requirements

The minimum requirements for Cognos 11.1.4 are currently listed at two CPUs. That is a straight-up no-go for running Cognos in almost all circumstances, regardless of which version you're on, so long as that version starts with the number 11.

A tale of two configs

I typically deploy a single-instance Cognos machine on AWS with the following specs and have no performance issues for demos or small deployments:

  • EC2 t3.xlarge
  • 4 vCPUs @ 2.5 GHz
  • 16 GB RAM

With 11.1.4 this configuration suffers from a laggy UI and poor query performance, especially in Dashboards. I observed CPU utilization hitting 100% frequently. As a remedy I migrated my AMI to:

  • EC2 c4.4xlarge
  • 16 vCPUs @ 3.4 GHz
  • 32 GB RAM

This adjustment resulted in extremely snappy performance throughout the UI and like-for-like dashboard query execution improved from ~25 seconds to ~2 seconds.

Why did this jump so much?

Cognos 11.1.4 is doing A LOT more than 11.0.x or 10.x ever dreamed of with a host of new interactive services running in the background – services that do things like natural language query, automatic visualization or dashboard creation, on-the-fly predictive forecasting, integrated data prep and modeling…

Tip Three: SSD > HDD

One often unexplored area of performance improvement for Cognos 11.1.4 (or for anyone using Transformer, for that matter) is increasing the speed of your HDD or migrating to an SSD. All of the new features rely on either the Cognos 10 DQM engine or the Cognos 11.1 'Flint' Spark SQL engine to process queries. In either case data is written to and read from disk, so I/O becomes a larger factor in determining performance than ever before, especially for any interactive task.

Tip Four: Your Browser Matters Now

Go ahead and conduct an experiment for me – pop into Dashboards on your local machine, build a bunch of visualizations and observe your resource use. You may notice that Cognos running in Chrome is capable of maxing the CPU on your laptop. It is capable of maxing the CPU on my monster gaming desktop as well if I push it hard enough.

So the final place I’d look to juice performance is actually at the hardware your users are toting around with them. It may seem a little unfair but just imagine life as a poor Power BI or Tableau administrator – outdated and slow laptops define the entirety of their user experience.

Cognos 11.1.4 Performance Summary

All of this is to say that I have had success increasing performance by significantly increasing the specs of the application server. This isn’t necessarily shocking but it’s comforting in a way because it means the problem isn’t bad code. It also means that if you’ve been skating by on the same server config you put into place in 2015 it’s probably time for an upgrade.

Data Modules as a Single Source of Truth

August 1, 2019 by Ryan Dolley 10 Comments

The conversation spurred by my What are Cognos Data Modules blog post continues with an excellent question by Heather, who asks what to do about maintaining a single source of truth in Cognos Analytics while still using data modules. This does pose a challenge. I’ll discuss some options below and rate them using a scale of 1 – 5 Analysis Studio cubes.

Cognos Data Modules are a great single source of truth
Cognos Data Modules can serve as a single source of truth, if you let them

Option 1: Don’t Use Data Modules

One common way to solve this problem – sadly – is to not use data modules in any capacity. Turn them off for end users and IT and stick with tried and true Framework Manager. If it ain't broke, don't fix it. This approach will preserve a single source of truth, but at significant cost:

  • No access to new features… ever (FM is not receiving feature updates)
  • FM development cycles are very long
  • End users will have no ability to model data in Cognos…
  • Which means they won’t be able to do anything complex in Cognos…
  • Which means they’ll use Power BI instead!

Odds are if you're reading this blog you'd prefer your users work in Cognos vs Power BI or Tableau. Choosing option 1 will definitely preserve a single source of truth, but at the cost of relevance.

Final Verdict

Option 2: Treat Data Modules Exactly Like Framework Manager

Contrary to popular belief – and IBM's marketing – data modules are not just for self-service users. The feature set of data modules has reached near parity with Framework Manager (with some notable exceptions) and has surpassed it in some very significant ways – relative time, for example. It's also important to understand that Cognos security applies to a data module in just the same manner as a Framework Manager package.

There is, in principle, no reason you can't migrate your IT modeling work to data modules wherever possible while restricting their use entirely to the IT team. In this way you can future-proof your development and take advantage of a growing list of very cool features while leaving your traditional BI workflow and single source of truth intact.

However, you foreclose the opportunity to take Cognos to the next level and enable real self-service.

Final Verdict

Option 3: Use Linked Tables

Linked tables allow you to use tables from one data module as the source for tables in another data module. These tables inherit all changes from the source, including field name changes, field additions/subtractions, calculation changes, SQL changes, etc…

Linked tables in action!

Using these links, it's possible for IT to build and maintain a high quality, tested and accurate data model while still providing a significant degree of end-user flexibility. In the image above, an end user has imported three linked tables over which IT has 100% control and joined them to an Excel spreadsheet. To do this, follow these steps:

  1. Create one or more data servers for your source databases
  2. Restrict access to create new models using those servers to IT only
  3. Build a data module that contains the tables you wish to govern
  4. Save the data module in a folder where users can access it but not overwrite it
  5. Remove the ‘break link’ capability from users. This means they cannot edit these tables in any way
  6. Teach them how to use linked tables and let them go wild!

By following these steps you can ensure that all self-service data modeling for governed sources uses your logic, your calculations and even your joins – the joins you define are imported alongside your linked tables. Your users can still model to their heart’s content with their departmental or personal data but when it comes to your governed enterprise data, your rules come first.

Final Verdict

Option 4: Stop, collaborate and listen

For better or worse, speed and flexibility have decisively crushed accuracy and security in the battle for BI market share. End users routinely show up to meetings with three competing Power BI/Tableau workbooks that disagree on all relevant metrics but were produced in four days and look incredible. And people seem absolutely thrilled with this state of affairs based on recent events!

Therefore Option 4 isn’t so much a technical solution as it is a cultural one. Yes, you adopt the strategy outlined in Option 3 – but you do so cheerfully with the knowledge that your end users absolutely will colossally screw up the data in Cognos, and that it’s your job to identify and fix it when this happens. This is by far the hardest thing for long time Cognos pros to do, and it was extremely hard for me to come around to this way of thinking. The final piece of the puzzle was the following realization:

  • Users are incredible innovators in the art of producing bad data – they truly cannot be stopped
  • Trying to enforce 100% data accuracy in Cognos just drives them to other tools
  • Errors produced in those other tools are completely outside of my ability to monitor, influence or correct – as are those users
  • This cycle absolutely destroys ‘single version of truth’ for an organization as a whole, even if it is maintained for Cognos

Switching to a more open, collaborative approach in Cognos has the benefit of increasing user satisfaction, improving organizational perception and, eventually, growing investment in Cognos. I have seen this work at my clients who have embraced it.

Final Verdict

That’s right, the final verdict for Option 4 is five PowerPlay Studio cubes of approval – the highest approval I can give.

PowerPlay was way better than Analysis Studio anyway.



Alias Shortcuts in Cognos Data Module

July 30, 2019 by Ryan Dolley 5 Comments

My recent What are Cognos Data Modules post generated some interesting discussion around the alias shortcut functionality of Framework Manager and whether or not it is available in data modules. I initially responded by saying you could make data module alias tables; however, SirM challenged me that it isn't the same at all.

SirM is right.

What is an Alias in Cognos

In my experience the primary reason to build an alias is to tightly control how Cognos executes joins. Imagine you have two tables, Sales_Fact and Time_Dim. Sales_Fact contains the fields order_date and shipped_date, both of which you wish to join to Time_Dim. You could join the tables twice and be done with it, but then you are at the mercy of Cognos as to which join it will include when generating SQL. This will introduce unintended or completely incorrect results when the ‘wrong’ join is used.

Enter Framework Manager alias shortcuts. An alias shortcut is essentially a pointer to another table that has the following properties:

  • Inherits all fields and all changes from the target table automatically
  • Ignores all relationships from the target table
  • Can be repeated as often as necessary

In our example above, we make an alias shortcut of Time_Dim called ‘Shipped_Date’ and join the alias to Sales_Fact. Shipped_Date will inherit all fields from Time_Dim but will have an independent relationship to Sales_Fact.
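
If it helps to see the pattern outside of Cognos, here is a rough pandas sketch of the same idea, using made-up data and the table names from the example above. The point is that the 'shipped' copy of Time_Dim carries its own, independent relationship to Sales_Fact:

import pandas as pd

time_dim = pd.DataFrame({"date_key": [20190101, 20190105],
                         "month_name": ["January", "January"]})
sales_fact = pd.DataFrame({"order_date": [20190101],
                           "shipped_date": [20190105],
                           "revenue": [1000.0]})

# order_date_dim plays the role of the original Time_Dim join;
# shipped_date_dim plays the role of the 'Shipped_Date' alias shortcut.
order_date_dim = time_dim.add_prefix("order_")
shipped_date_dim = time_dim.add_prefix("shipped_")

result = (sales_fact
          .merge(order_date_dim, left_on="order_date", right_on="order_date_key")
          .merge(shipped_date_dim, left_on="shipped_date", right_on="shipped_date_key"))
print(result[["revenue", "order_month_name", "shipped_month_name"]])

In Framework Manager the alias shortcut gives you this 'two roles, one table' behavior for free and keeps it in sync with Time_Dim automatically; the rest of this post looks at how close data modules can get.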

Alias in Data Modules

Can we do this in data modules? In Cognos 11.0.x the answer was basically 'No way!' However, recent releases have gone a long way toward closing this gap, especially once custom table functionality was added in the 11.1.x release stream. Let's explore the three options I recommend for building alias shortcuts in Cognos Analytics data modules and see which one fits best.

Copy Tables

Copying a table does exactly what you'd expect – it makes a copy of the table in question. You can create as many copies of a table as you want; however, changes you make to the target table do not flow through to the copied table in any capacity – whether you add or remove fields, rename fields or change calculation logic.

  • Inherits all fields and all changes from target table automatically: NO
  • Ignores all relationships from the target table: YES
  • Can be repeated as often as necessary: YES

View Tables

View tables are like copy tables with a couple of very important differences: views can be composites of many underlying tables, and they do inherit some changes from the target automatically. Specifically, they will inherit calculation changes made to the fields in the target table but will not inherit name changes or added/removed fields without manually editing the view definition. I will do a deep dive on their usage in a future post.

  • Inherits all fields and all changes from target table automatically: SORTA?
  • Ignores all relationships from the target table: YES
  • Can be repeated as often as necessary: YES

Linked Tables

Linked tables are essentially pointers to tables built and maintained in other data modules. They are extremely useful for BI teams concerned about data quality in Cognos Analytics, as you will see below.

Linked tables inherit all properties from the target table in the model in which they were built. This means that any changes I make in the target module – whether I add or remove fields, create new calculations or edit existing logic – will automatically flow through to all linked tables that reference it. This would appear to solve our alias shortcut problem, so what's the catch?

You can only import a linked table into a data module once. It cannot be copied and any views you build on it have the view limitations outlined above.

  • Inherits all fields and all changes from target table automatically: YES
  • Ignores all relationships from the target table: YES
  • Can be repeated as often as necessary: NO

So What’s the Solution?

Whenever you find yourself reaching for the alias shortcut button in data modules, ask yourself which alias table feature is most important for the task at hand.

  • If the most important thing is automated inheritance of all future changes, build a link table
  • If the most important thing is re-using the same table over and over, build a view table
  • In most instances, do not build a copy table

In reality, a view table should fit your needs in most circumstances. Yes, you will need to manually intervene to inherit certain changes listed above; however, this process takes about 30 seconds per table. Not ideal, but something most of us can live with.

How to Proceed

Alias tables have historically filled a very important role in building large scale Cognos models in Framework Manager, and their absence in data modules creates challenges to modeling the way we have for over a decade. Certainly many of my most skilled customers feel that without this functionality data modules don’t have much use.

It's worth considering, then, how Tableau and Power BI managed to dominate the mid-to-late-2010s BI market with a total absence of an equivalent to alias shortcuts or many other 'enterprise' BI modeling features. The answer lies in the culture and practice these solutions enabled, which delivered faster results for business users. How to apply those practices to Cognos using data modules will be a major theme of this blog going forward.

