Links to on-demand webinars I have completed for Dell Software

There are a number of links posted below to on-demand webinars that I completed during my time at Dell, covering Big Data, Data Integration and Data Profiling.

How to Work Smarter Not Harder with Big Data

The End of Data Access Challenges

Five Ways to Ease the Data Blending Challenge

I hope you enjoy these webcasts and can learn from them.


Maximizing Big Data’s Value

Stephen Swoyer from TDWI conducted an interview with me earlier this year in which we discussed how to start with Big Data, what the key questions are that you should be asking as an enterprise before you start, and how you can make the journey to Big Data success as painless as possible.  This article can be found here.


Weighing the Pros and Cons of the Data Lake Approach

Late last year I took part in a webcast with Elliot King, Unisphere Research analyst, on how to weigh the pros and cons of the Data Lake for your organization.  We discussed the five laws of data integration and how these must be the cornerstone of any Hybrid Data Ecosystem if the enterprise is to gain full, actionable insight from its information stores.  The article can be found here, with a recording of the webcast available.


Q&A: How Both Sides Can Help Heal the IT-Business Rift

I took part in a Q&A session with Linda Briggs from TDWI on how both sides can help heal the rift between IT and the Business; that article can be found here.


What Big Data Means for the Future of Self-service Business Intelligence

An article I wrote for Information Management on what Big Data means for the future of Self-Service Business Intelligence can be found here.


Achieving business intelligence right now (BIRN) without the IT Department's assistance

Addressing the pain and tension felt between the IT Departments and Business Units of major companies has been something I have been doing for most of my career over the last 16 years.  Business units, especially over the last few years, have been clamouring for faster and more flexible access to the data that IT Departments generate into relational databases, data warehouses and enterprise business intelligence systems.  In the following I describe a case history of one of my more notable successes in the battle to break down the barrier or 'wall' between IT and Business that is caused by this tension and that leads to the high failure rates seen in Business Intelligence systems within companies.  This is a précis of an article I wrote for Database Journal last year.

How to monitor, report and analyze service quality in near real-time:

A couple of years ago I was between major contracts and was contacted by one of the agencies that I have worked for before.  They had a department within a major retail bank that was having difficulties with an MS Access database which had been created to allow the department to track customer satisfaction results within the company.

The database, although well written by an internal resource, was quite rudimentary in its functionality and was only being used to store manual monthly imports from Excel and CSV files into various data tables.  Following the import, the user would then have to follow a set of instructions to amend stored queries within the database to create meaningful results, which could then be exported back to Excel for the team to format into graphs.  Once exported, the data was used to manually create graphs and tables, which were added to a dashboard used to present the data to the business.  The problems, as described by the head of the department, were that the database was slow, required two to three days of intensive work by a non-technical resource to input the data and then create the reports, and had produced inconsistent data due to the "human error" factor when amending the queries in the second step.

After viewing the database I agreed to hold a meeting with the major stakeholders to discuss their actual requirements and provide guidance on what they might need. Following the meeting it was obvious that the department required the following:

  •  The ability to store more than the 2 GB limit of MS Access, to allow trends to be forecast from stored data
  •  Automated upload of delivered files
  •  Automated production of the required reports, including dashboards and KPIs
  •  Automated delivery of the resultant dashboards to the company

At the meeting I discussed my recommendations and suggested that this project would be an ideal scenario for a Data Warehouse BI application utilising SQL Server and SSIS/SSRS to deliver dynamic content to the department and interested parties via a SharePoint server.  The head of the department presented my recommendations to the company's IT Department and enquired about the feasibility of starting a project to deliver a BI system based on them.  Unfortunately this proved unsuccessful, as the company had just been taken over and the IT Department was fully committed to supporting the changes to the operational systems that this required.  At this point the head of department asked whether I thought there was any way I could assist.  Following discussions with a colleague I had known from a previous role, when I worked within the IT area, it was established that I could gain access to an instance of MS SQL Server from the business desktop via ODBC, which led me to believe that I could help.  I took this opportunity to build rapport with the CIO of the company and discuss my proposed solution, enabling me to move forward with the project.  Following our meetings and discussions with his team, he agreed that I could embark on delivering a BI suite utilising just those applications available on the standard desktop, along with access to a SQL Server instance.
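
To give a flavour of what that desktop access looked like, here is a minimal sketch of the kind of ADO-over-ODBC connection the solution depended on; the server name (DEVBOX01) and database name (CustomerSat) are hypothetical stand-ins, and integrated Windows security is assumed.

    ' Requires a reference to the Microsoft ActiveX Data Objects library,
    ' which an Access Project (ADP) carries by default.
    Sub TestSqlServerAccess()
        Dim cn As ADODB.Connection
        Set cn = New ADODB.Connection

        ' Connect over ODBC using the SQL Server driver shipped with the
        ' standard desktop build; server and database names are made up.
        cn.Open "Driver={SQL Server};Server=DEVBOX01;" & _
                "Database=CustomerSat;Trusted_Connection=Yes;"

        ' A simple round trip proves the desktop can reach the instance.
        Dim rs As ADODB.Recordset
        Set rs = cn.Execute("SELECT @@SERVERNAME")
        Debug.Print rs.Fields(0).Value

        rs.Close
        cn.Close
    End Sub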

The main focus of this project was to move the application from a rudimentary MS Access database to a fully fledged application utilising whatever applications and tools were available within the business area.  Investigation of the desktop established that MS Office 2003 Professional was installed on every user's desktop, along with Adobe Distiller 6.0.  This, together with the availability of an instance of SQL Server 2005, led to the decision to convert the existing MS Access database to an MS Access Project (ADP) connected to a SQL Server back end, which would then utilise VBA and COM to automate all the manual processes, including creation and delivery of the dashboard.  Using a clean MS Access Project connected to an instance of SQL Server 2005 on the company's development box, I proceeded to convert the import routines from the old database into data loads and error-checking routines, utilising VBA and SQL Server stored procedures to check the data on load.  Unfortunately, because on-the-fly table creation was restricted on the company SQL Server, and because no SSIS server or other SQL tools were available, it was necessary to build permanent load tables up front for the ADP to load the data into.  To enable grouped and summed data to be used with the output of the ADP, I adapted a dynamic pivot routine that I have used before within SQL Server 2005; this provides very similar functionality to the Crosstab Query within MS Access.
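
As a rough illustration of the dynamic pivot idea, below is a compressed sketch of how such a routine can be driven from the ADP over ADO; the table and column names (MonthlyScores, Branch, ScoreMonth, Score) are hypothetical, not the bank's actual schema.

    Sub RunDynamicPivot(cn As ADODB.Connection)
        Dim rs As ADODB.Recordset, cols As String

        ' 1. Discover the distinct values that will become pivot columns.
        Set rs = cn.Execute("SELECT DISTINCT ScoreMonth " & _
                            "FROM dbo.MonthlyScores ORDER BY ScoreMonth")
        Do Until rs.EOF
            If Len(cols) > 0 Then cols = cols & ", "
            cols = cols & "[" & rs!ScoreMonth & "]"
            rs.MoveNext
        Loop
        rs.Close

        ' 2. Build and execute the PIVOT statement (SQL Server 2005
        '    syntax); this mirrors what an Access crosstab query returns.
        Dim sql As String
        sql = "SELECT Branch, " & cols & " FROM " & _
              "(SELECT Branch, ScoreMonth, Score FROM dbo.MonthlyScores) src " & _
              "PIVOT (AVG(Score) FOR ScoreMonth IN (" & cols & ")) pvt"
        Set rs = cn.Execute(sql)
        ' rs can now feed a report or be written out to Excel.
    End Sub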

Once the data had been imported and saved correctly, it was then a matter of delivering the reports.  Allowing the users to select reports from a form in the ADP, and then using a VBA module to call a stored procedure to create the data required for each report, removed the "human error" side of the equation.  Once selected, the reports ran in the background, creating an Excel version of each chosen report by using VBA COM calls to open Excel on the client machine, call an existing template and populate the data from ADO recordsets based on stored procedures.  These reports included monthly average data and results against targets, summary data based on yearly and quarterly stored and dynamically created data, and the monthly dashboard, which gave an overview of the company's performance against not only targets but also its competition, using automatically produced charts instead of figures.
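
The Excel side of that process can be sketched roughly as follows; the template path, worksheet name and stored procedure name are hypothetical, and Excel is driven late-bound so no extra references are needed beyond ADO.

    Sub ExportReportToExcel(cn As ADODB.Connection)
        Dim xl As Object, wb As Object
        Dim rs As ADODB.Recordset

        ' Late-bound COM call opens Excel on the client machine.
        Set xl = CreateObject("Excel.Application")
        Set wb = xl.Workbooks.Open("C:\Templates\MonthlyReport.xlt")

        ' A stored procedure supplies the data for this report.
        Set rs = New ADODB.Recordset
        rs.Open "EXEC dbo.usp_MonthlyAverages", cn

        ' CopyFromRecordset drops the whole result set into the template,
        ' where the pre-built charts and formatting pick it up.
        wb.Worksheets("Data").Range("A2").CopyFromRecordset rs
        rs.Close

        wb.SaveAs "C:\Reports\MonthlyReport.xls"
        wb.Close False
        xl.Quit
    End Sub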

After the Excel reports had been verified by the team, they then required the ability to create PDF versions of the documents to be automatically emailed to branches, divisions and regions.  This was achieved using PDF Distiller, which had been installed on the users' machines as standard.  The emailing of the reports was achieved by leveraging COM from MS Access to talk to an SMTP server, creating each mail item, attaching the required reports and then despatching them.  The SMTP server was used to avoid the recent updates to MS Outlook security, which would otherwise have required a special script for the users despatching the mail, to prevent the annoying security warning pop-up that would have appeared for each report (at the lowest level this would be over 700).  To achieve a 'sent' item in the department's mailbox, a copy was sent to the department group mailbox, and a rule run on the incoming folder transferred mails with a certain subject line into the sent folder of the mailbox.  Along with the produced reports and graphs, the users were also given the ability to generate reports as Excel files to allow further investigatory work to be completed.
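
A minimal sketch of that SMTP despatch, using the CDO COM component rather than Outlook, might look like the following; the server name and addresses are of course hypothetical.

    Sub SendReportBySmtp(attachmentPath As String, recipient As String)
        Dim msg As Object
        Set msg = CreateObject("CDO.Message")

        ' Point CDO straight at the internal SMTP server, sidestepping
        ' the Outlook object model and its security prompts entirely.
        With msg.Configuration.Fields
            .Item("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2
            .Item("http://schemas.microsoft.com/cdo/configuration/smtpserver") = "smtp.internal.example"
            .Item("http://schemas.microsoft.com/cdo/configuration/smtpserverport") = 25
            .Update
        End With

        msg.From = "service.quality@bank.example"
        msg.To = recipient
        msg.Subject = "Monthly Service Quality Report"
        msg.AddAttachment attachmentPath

        ' Copy the group mailbox so the inbox rule can file a 'sent' copy.
        msg.BCC = "sq.groupbox@bank.example"
        msg.Send
    End Sub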

During the development stage and throughout the testing I updated the CIO of the company on progress through my IT contacts, ensuring at all stages that the project manager responsible for the production area was aware of what was being proposed.  On completion of the project I produced a full suite of technical documentation, including a detailed breakdown of all code modules, SQL Server routines and stored procedures, and at this point the business unit again contacted the CIO and the IT project manager responsible for desktop applications, not with a request to build out a new project but with one to support an existing system.  Because of the constant contact between myself, the Business Unit Head, and the CIO and project manager within IT there were no surprises, and because the project had been formed within their guidelines, approval to support the system was forthcoming.  It is possible, with a little creativity and a lot of communication, to provide a form of Business Intelligence to the broader community without utilising a Data Warehouse or any of the tools normally associated with either MOLAP or ROLAP storage, and with little support from the IT Department.

The keys I believe to the successful completion of this type of project are:

  • Communication with all parties for the full life of the project.
  • Establishing at an early point the actual requirements of the business.
  • Full disclosure to the CIO/IT Department of the business's requirements and also your development requirements.
  • An ability for the consultant employed to develop across numerous technologies and environments.
  • Full ITIL documentation on completion of the project.


TDWI San Diego

Wow, what a really interesting event: education by the bucketload, and then the chance to explain to people Quest Software's entry into the BI space and demonstrate our two tools to assist both IT and the Business to gain insight quickly.  I have had really positive feedback from everyone who came to visit and watch the demonstrations, and I look forward in the coming months to forging good relationships with our beta-testing community.  Toad for Data Analyst 3.0 will be launched towards the end of September, and Quest BI Studio will remain a freeware beta programme until at least December; we want your help to make sure that the product fulfils all your needs.  Quest BI Studio, along with TDA 3.0, already has full connectivity to most data sources, including cloud sources and your corporate BI systems, and with its easy-to-use graphical interface and three analytical views designed for the business user, it should allow faster time to analytics within your company.  Please feel free to post messages and questions on the community site.
