Getting (even more) Serious about Data Quality and Governance
Ron Moore, MTG; Cameron Lackpour, CLSolve; & Joe Caserta, Caserta Concepts
Wednesday, March 12, 2014 6:00 AM - 7:00 AM AEDT
Nothing is more important than data quality. But if the steps to ensure high data quality aren’t fast and easy, people won’t do them – or at least they won’t do enough of them. Data quality has always been a difficult job, and it consumes a lot of resources even with traditional data sources such as ERP that are relatively well behaved. Now analytics is spreading to more users and to data that is far less well behaved. What should we be doing, and how can we make it as fast and easy as possible?
In this webinar we will put those questions to our panelists and we will invite your opinions and questions. Some of our topics will include:
• Is data quality really a problem? Where and how much?
• Who has responsibility for data quality?
• What techniques can we apply at the data source?
• What techniques can we apply within Essbase and Planning?
• Can we adopt some “simple stupid rules” for DQ?
• What is the role of documentation?
• What documentation is effective and worth the effort?
Panelists include: Joe Caserta, Caserta Concepts; Cameron Lackpour, CLSolve; and Ron Moore, MTG. Cameron and Ron will bring an Essbase perspective. Joe will provide a data warehouse and big data perspective. Click here to register.
Five Steps to Redesigning the Financial Close Application
Brian Wilson, TopDown Consulting
Wednesday, March 26, 2014 3:00 AM - 4:00 AM AEDT
When a company decides to rebuild or redesign its financial-close application, the decision is often driven by a major event such as a merger or new reporting requirements. Whatever the case, this is a great opportunity to streamline the close process. Moreover, by designing an application that minimizes the number of changes that require testing, you can effectively reduce the maintenance effort.
TopDown Consulting’s Solutions Architect Brian Wilson will address five critical steps found in any redesign:
1. Defining the application’s utility: Clearly define what the application is used for, and cut the clutter. For example, if the primary objective of the application is to meet the requirements for external reporting, the application should include only the data needed to produce a balance sheet and income statement, and should leave out any data used primarily for internal reporting.
2. Reducing the number of accounts: This reduces the amount of time to load data and also reduces the number of accounts that need to be reconciled. In some cases, it can even reduce the amount of data that must be manually entered.
3. Reducing the number of entities: The more entities that must be consolidated, the longer the consolidation will take. Similarly, the more entity hierarchies that are in an application, the more consolidations will need to be run.
4. Reducing manual input: Manual data entry should be limited once data is in the consolidation application, particularly at the base entity level, since that level typically involves a large number of users. The more those users have to change data after it is loaded, the more likely a delay becomes. With differences in time zones, these delays can add days to the close process.
5. Designing the application with minimized maintenance in mind: At the end of the day, if maintenance is required during the close, it will slow the process. Instead, changes should be fully vetted in a test environment and then moved to production.
Click here to register.