Category Archives: QuestBI

Syndication to Quest Business Intelligence

Achieving Business Intelligence Right Now (BIRN) without the IT Department's assistance

Addressing the pain and tension felt between the IT Departments and Business Units of major companies is something I have been doing for most of my career over the last 16 years.  Business units, especially over the last few years, have been clamouring for faster and more flexible access to the data that IT Departments load into relational databases, data warehouses and enterprise business intelligence systems.  In the following I describe a case history of one of my more notable successes in the battle to break down the barrier or ‘wall’ between IT and Business that this tension causes, and that leads to the high failure rates seen in corporate Business Intelligence systems.  This is a précis of an article I wrote for a database journal last year.

How to monitor, report and analyze service quality in near real-time:

A couple of years ago, between major contracts, I was contacted by an agency I had worked with before.  They had a client department within a major retail bank that was having difficulties with an MS Access database created to let the department track customer satisfaction results across the company.

The database, although well written by an internal resource, was quite rudimentary in its functionality and was only being used to store manual monthly imports from Excel and CSV files into various data tables.  Following the import, the user had to follow a set of instructions to amend stored queries within the database to produce meaningful results, which were then exported back to Excel for the team to format into graphs.  Once exported, the data was used to manually create the graphs and tables that made up a dashboard presented to the business.  The problems, as described by the head of the department, were that the database was slow, that it required two to three days of intensive work by a non-technical resource to input the data and create the reports, and that it had produced inconsistent data due to the “human error” factor when amending the queries in the second step.

After viewing the database I agreed to hold a meeting with the major stakeholders to discuss their actual requirements and provide guidance on what they might really need.  From that meeting it was clear that the department required the following:

  •  The ability to store more than the 2 GB MS Access limit, to allow trends to be forecast from stored data
  •  Automated upload of delivered files
  •  Automated production of the required reports, including dashboards and KPIs
  •  Automated delivery of the resultant dashboards to the company

At the meeting I presented my recommendations and suggested that this project would be an ideal scenario for a data warehouse BI application using SQL Server and SSIS/SSRS to deliver dynamic content to the department and interested parties via a SharePoint server.  The head of the department took my recommendations to the company's IT Department and enquired about the feasibility of starting a project to deliver a BI system on that basis.  Unfortunately this proved unsuccessful: the company had just been taken over, and the IT Department was fully committed to supporting the resulting changes to the operational systems.  At this point the head of department asked whether I thought there was any way I could assist.  Through a colleague I knew from a previous role within the IT area, I established that an instance of MS SQL Server could be reached from the business desktop via ODBC, which led me to believe that I could help.  I took this opportunity to build rapport with the CIO of the company and discuss my proposed solution, so that I could move forward with the project.  Following our meetings and discussions with his team, he agreed that I could embark on delivering a BI suite using only the applications available on the standard desktop, along with access to a SQL Server instance.

The main focus of this project was to move the application from a rudimentary MS Access database to a fully fledged application using whatever applications and tools were available within the business area.  Investigation of the desktop established that MS Office 2003 Professional was installed for every user, along with Adobe Distiller 6.0.  This, together with the availability of an instance of SQL Server 2005, led to the decision to convert the existing MS Access database to an MS Access Project (ADP) connected to a SQL Server back end, which would then use VBA and COM to automate all of the manual processes, including creation and delivery of the dashboard.  Starting from a clean MS Access Project, I connected to an instance of SQL Server 2005 on the company's development box and converted the import routines from the old database into data loads and error-checking routines, using VBA and SQL Server stored procedures to validate the data on load.  Unfortunately, because the application was not permitted to create tables on the company SQL Server, and neither an SSIS server nor any SQL tools were available, it was necessary to build permanent load tables up front for the ADP to load data into.  To allow grouped and summed data to be used in the output of the ADP, I adapted a dynamic pivot routine that I have used before within SQL Server 2005; this provides very similar functionality to the Crosstab Query within MS Access, as the sketch below illustrates.
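
To make the dynamic pivot concrete, here is a minimal sketch of the pattern as it might be driven from VBA inside the ADP.  The table and column names (dbo.SatisfactionScores, SurveyMonth, BranchName, Score) are hypothetical stand-ins rather than the originals; the point is the two-step pattern of collecting the distinct values first and then assembling a PIVOT statement from them.

    ' Sketch only: builds and runs a dynamic PIVOT against SQL Server 2005
    ' from an Access Data Project. Object names are illustrative placeholders.
    Public Sub RunDynamicPivot()
        Dim cn As ADODB.Connection
        Dim rs As ADODB.Recordset
        Dim cols As String, sql As String

        Set cn = CurrentProject.Connection   ' the ADP's own SQL Server connection

        ' Step 1: collect the distinct values that will become pivot columns
        Set rs = cn.Execute("SELECT DISTINCT SurveyMonth FROM dbo.SatisfactionScores ORDER BY SurveyMonth")
        Do While Not rs.EOF
            If Len(cols) > 0 Then cols = cols & ", "
            cols = cols & "[" & rs!SurveyMonth & "]"
            rs.MoveNext
        Loop
        rs.Close

        ' Step 2: assemble and execute the PIVOT statement itself
        sql = "SELECT BranchName, " & cols & " FROM " & _
              "(SELECT BranchName, SurveyMonth, Score FROM dbo.SatisfactionScores) AS src " & _
              "PIVOT (AVG(Score) FOR SurveyMonth IN (" & cols & ")) AS pvt"
        Set rs = cn.Execute(sql)
        ' ... hand rs on to the reporting routines ...
    End Sub

The same two steps work equally well server-side in a stored procedure with sp_executesql; keeping the assembly in VBA simply meant everything stayed inside the ADP.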

Once the data had been imported and saved correctly, it came down to the matter of delivering the reports.  Allowing the users to make their selections on a form in the ADP, and then using a VBA module to call a stored procedure that created the data required for the reports, removed the “human error” side of the equation.  Once selected, the reports ran in the background, creating an Excel version of each chosen report: VBA COM calls opened Excel on the client machine, loaded an existing template and populated the data using ADO recordsets based on stored procedures, as sketched below.  These reports included monthly average data and results against targets, summary data based on yearly and quarterly stored and dynamically created data, and the monthly dashboard, which gave an overview of the company's performance against not only its targets but also its competition, using automatically produced charts instead of raw figures.
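
As a rough illustration of that report step, the sketch below opens Excel through COM, loads a template and fills it from an ADO recordset returned by a stored procedure.  The procedure name, template and paths (dbo.usp_MonthlyAverages, MonthlyReport.xlt) are invented for the example, not the originals.

    ' Sketch only: create one Excel report from a stored procedure result set.
    Public Sub BuildMonthlyReport(ByVal reportMonth As String)
        Dim xl As Object, wb As Object
        Dim cmd As ADODB.Command
        Dim rs As ADODB.Recordset

        ' Call the stored procedure that shapes the report data server-side
        Set cmd = New ADODB.Command
        cmd.ActiveConnection = CurrentProject.Connection
        cmd.CommandType = adCmdStoredProc
        cmd.CommandText = "dbo.usp_MonthlyAverages"
        cmd.Parameters.Append cmd.CreateParameter("@Month", adVarChar, adParamInput, 7, reportMonth)
        Set rs = cmd.Execute

        ' Drive Excel on the client machine via COM and populate the template
        Set xl = CreateObject("Excel.Application")
        Set wb = xl.Workbooks.Open("\\server\templates\MonthlyReport.xlt")
        wb.Worksheets("Data").Range("A2").CopyFromRecordset rs   ' template charts read from this range
        wb.SaveAs "\\server\reports\MonthlyReport_" & reportMonth & ".xls"
        wb.Close False
        xl.Quit
    End Sub

CopyFromRecordset keeps the data transfer to a single COM call, which matters when looping over many reports in the background.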

After the Excel reports had been verified by the team, they required the ability to create PDF versions of the documents to be emailed automatically to branches, divisions and regions.  This was achieved using Adobe Distiller, which was installed on the users' machines as standard.  The emailing of the reports was achieved by using COM from within MS Access to talk to an SMTP server, create the mail item, attach the required reports and despatch them.  The SMTP server was used to avoid the recent updates to MS Outlook security, which would otherwise have required a special script to suppress the annoying security warning that pops up for each report despatched (at the lowest level of reporting this would be over 700 messages).  To leave a ‘sent’ item in the department's mailbox, a copy of each mail was sent to the department group mailbox, and a rule on the incoming folder moved mails with a certain subject line into the sent folder of that mailbox.  Along with the produced reports and graphs, the users were also given the ability to generate reports as Excel files to allow further investigatory work.  A sketch of the despatch routine follows.
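
A minimal sketch of that despatch routine, using CDO to talk straight to the SMTP server (and so never touching the Outlook object model and its security prompts), might look like this.  The server name and addresses are placeholders.

    ' Sketch only: send one report via SMTP with CDO; no Outlook involved.
    Public Sub SendReport(ByVal pdfPath As String, ByVal recipient As String)
        Const cdoSendUsingPort As Long = 2
        Dim msg As Object
        Set msg = CreateObject("CDO.Message")

        With msg.Configuration.Fields
            .Item("http://schemas.microsoft.com/cdo/configuration/sendusing") = cdoSendUsingPort
            .Item("http://schemas.microsoft.com/cdo/configuration/smtpserver") = "smtp.internal.example"
            .Item("http://schemas.microsoft.com/cdo/configuration/smtpserverport") = 25
            .Update
        End With

        With msg
            .From = "service.quality@example.com"
            .To = recipient
            .CC = "department.groupbox@example.com"   ' copy that feeds the 'sent' folder rule
            .Subject = "Monthly Service Quality Report"
            .TextBody = "Please find the latest report attached."
            .AddAttachment pdfPath
            .Send
        End With
    End Sub

Because the message goes straight out through SMTP, no Outlook security prompt ever fires, however many hundred reports are despatched.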

During development and throughout testing I kept the CIO of the company updated on progress through my IT contacts, ensuring at every stage that the project manager responsible for the production area was aware of what was being proposed.  On completion of the project I produced a full suite of technical documentation, including a detailed breakdown of all code modules and all SQL Server routines and stored procedures.  At this point the business unit again contacted the CIO and the IT project manager responsible for desktop applications, not with a request to build out a new project but with one to support an existing system.  Because of the constant contact between myself, the Business Unit head and the CIO and project manager within IT, there were no surprises, and because the project had been formed within their guidelines, approval to support the system was forthcoming.  With a little creativity and a lot of communication, it is possible to provide a form of Business Intelligence to the broader community without a data warehouse or any of the tools normally associated with MOLAP or ROLAP storage, and with little support from the IT Department.

The keys, I believe, to the successful completion of this type of project are:

  • Communication with all parties for the full life of the project.
  • Establishing at an early point the actual requirements of the business.
  • Full disclosure to the CIO/IT Department of the business's requirements and also your development requirements.
  • The ability of the consultant employed to develop across numerous technologies and environments.
  • Full ITIL documentation on completion of the project.


Filed under Business Intelligence, QuestBI

TDWI San Diego

Wow, what a really interesting event: education by the bucket load, and then the chance to tell people about Quest Software's entry into the BI space and demonstrate our two tools, which help both IT and the Business gain insight quickly.  I have had really positive feedback from everyone who came to visit and watch the demonstrations, and I look forward in the coming months to forging good relationships with our beta-testing community.  Toad for Data Analyst 3.0 will be launched towards the end of September, and Quest BI Studio will remain a freeware beta programme until at least December; we want your help to make sure that the product fulfills all your needs.  Quest BI Studio, along with TDA 3.0, already has full connectivity to most data sources, including the cloud and your corporate BI systems, and with its easy-to-use graphical interface and three analytical views designed for the business user, it should allow faster time to analytics within your company.  Please feel free to post messages and questions on the community site.


Filed under Business Intelligence, QuestBI

Big Data Hype in Europe

I was recently asked to contribute to the IDG Connect blog; my article went up on the site yesterday, so please feel free to comment: http://bit.ly/pL7FZS


Filed under Big Data, QuestBI

Business Intelligence for the Business and IT

Okay, so as I write this I am on the train on my way to discuss a problem that I see all too often in corporate business today.  I am travelling to talk to a group of Business Users who work for a major retail bank here in the UK.  I am trying to identify how they use data, how they currently connect to their data sources, and how they use that data to complete analysis that provides useful insight to drive decision-making within the company.

As I have spoken about before on LinkedIn and elsewhere on the web, one of the major hurdles to a successful Business Intelligence strategy within any company, regardless of sector, is the gap between IT and the BI team.  This gap is probably responsible for 90% of the failures we see with corporate BI solutions.  The dichotomy between the standpoints of IT, who want data governance and security above all, and the Business, who want result-driven decision-making, or Business Intelligence Right Now (BIRN), is what drives this failure rate.  It is not that the company's IT team does not want to help, or that the Business wants to go off and do its own thing; it is just that no one has yet come up with a good solution that lets both parties play in the same sandbox without upsetting each other.

My discussions tomorrow are aimed at discovering the frustrations felt by the business user in a large multinational organisation and how those users are currently solving their problems.  I will then back this up with a follow-up meeting with the IT stakeholders to clarify their point of view.

I believe that only through a greater understanding of how both sides see the problem, and then a process of mutual understanding and engagement, can we identify the needs of both parties and hope to develop tools to satisfy those needs.


Filed under QuestBI, Uncategorized

To the Cloud

Can the Cloud provide the processor-intensive computing power and storage facilities required by a major corporation looking to replace its manpower-, energy- and space-intensive current BI solution?  The Cloud provides a near-unlimited pool of computing power, memory and storage, delivered in affordable discrete modules to the end user.  This business model, which delivers unlimited scalability with very little overhead, is undoubtedly appealing to the corporate finance departments of many major corporations, and I believe from my experience that, with careful planning, it can be utilised by any company.

The following considerations must, however, be factored into the decision to adopt enterprise BI in the cloud for your company:

  • Plan for the worst
  • Perform due diligence for security, backup and disaster recovery
  • Do not overlook BI Cloud pricing and contract matters
  • Evaluate the long term cost of ownership
  • Investigate license requirements
  • Consider your data transfer requirements

Many of the major vendors in the BI community are now actively seeking a presence in the Cloud BI arena – SAP BusinessObjects BI OnDemand, Microsoft Azure and IBM Blue Insight among them – and whether they succeed or fail in these ventures will inevitably drive the use of the Cloud for mainstream BI solutions.  I believe, however, that BI in the Cloud will primarily be used by major corporations as a development tool to reduce overhead costs.  It represents a way for a BI application to be developed, installed and adapted to need, with reduced costs and easier deployment, without the need for capital investment in hardware and infrastructure space.

Pros: 

  • Cost
  • Speed of Deployment
  • Scalability
  • Ease of Access especially for Power users and Analysts

Cons: 

  • Data transfer rates – especially for data sets of a terabyte or more.
  • SaaS offerings in particular need to be tailored to the data they are linking into.
  • The possibility that the vendor your company chooses in this start-up stage may not survive to support your long term needs.

Conclusion: 

Whether the Cloud can support BI is not in question here; the question is whether it can support a major corporation's BI needs.  I believe that the ability to provide a pay-as-you-compute infrastructure will open up new data warehouse storage options and the possibility of unlimited scalability within your corporate environment.  These major plus points must, however, be tempered with the realisation that BI in the cloud is still in its infancy.  Serious consideration must be given to security, data transfer rates and the chosen vendor's risk within the marketplace.  If your company can perform due diligence to satisfy itself that these possible stumbling blocks can be offset, then the cloud can satisfy the most demanding of companies.  I believe that the main users of cloud BI as their primary source of BI will be SMEs, and that larger corporations will use the cloud as a sandbox for pre-deployment development and testing.


Filed under Cloud BI, QuestBI

Hardware and Memory

Cheap memory and multi-core.  With the cost of memory at its lowest for years and new developments in multi-core processors hitting the marketplace, the way we think about building and implementing a Business Intelligence solution has changed.  The ability to run data warehouses in memory, and the processing power now available, will change the way data is handled, not just in the business area but also in the corporate semantic layer.  The mega-vendors will not lose their position within the corporate area, because of the massive amounts of funds already invested in their solutions, but they must embrace the challenge of providing access to the corporate layer at the same speed at which other areas of information can be accessed.  You cannot have a user accessing sales or marketing data through a ‘Google’-type search system from the internet or departmental sources, with all its inherent speed, only to wait two weeks for IT to deliver matching governed, sanitised product data; this is of no use and in certain businesses would be pointless.  This seismic shift in the way information can and will be delivered will ensure that businesses can gain insight in near real time into the reasons for their failure or success in the market; enabling decisions to be changed quickly will help companies be more profitable in these uncertain economic times.


Filed under Business Intelligence, QuestBI

Where are BI Systems, Applications and Tools heading in the next ten years?

I am very excited about all things BI at the moment.  A lot of the mega-vendors are tearing around in development trying to catch up with the independent vendors, who have realized that the future of Business Intelligence rests on the idea that the end users who need insight into their businesses are the people who really need to control what, how and when BI is delivered.  These vendors are having a major impact on the BI space: they are innovative, progressive and determined to succeed in an area which for the last ten years has seen a 90% failure rate.

The following are my picks for the requirements that future systems, at all levels of the enterprise, will need to succeed in the BI space of the future:

  • In Memory Analytics that the user controls in real time
  • Simplicity of Data Search, Detection and Integration
  • Ability to connect to any and all data sources including the Corporate BI semantic layer
  • Speed, Speed and more Speed
  • Ease of use
  • Collaboration
  • Continuity and governance of corporate BI and business statistical data

More to follow.


Filed under Business Intelligence, QuestBI

Self Service BI

Until very recently I have spent most of my consulting career designing and constructing tactical Business Intelligence (BI) tools for Business Users, sometimes with the co-operation of the IT Department and sometimes in direct competition with it.  Because of this I have been cautious when investigating the new trend of Self-Service BI, which claims to provide a complete solution to the business user through advanced visualization, data discovery and integration, and visual analytics.

Sitting at home one evening recently, I was pondering the problems that face both the IT Department and the Business User when searching for the truth amongst the myriad sources of information prevalent within most organizations.  In the office, a request for information requires the business user to submit a requirement to the IT department for a snapshot of data; the user then has to wait until the request is fulfilled before they can take the data supplied, extract the exact figures they want, and marry that data to department data to actually reach the insight they are looking for.  Compare this with the near-instantaneous results of a search request to any one of the major web search providers, and you can completely understand the user's frustration.

The new swathe of vendors offering to supply the answer to this problem have realized that business users have become disappointed by the inflexibility of Corporate BI implementations over the last ten years in providing insight into their corporate data.  The new “Self-Service” tools currently on offer seem to solve this problem by combining interactive visual analytics with data integration and near-instant results, even for complex BI cases.  It also means that the new vendors are placing their sales effort firmly with the Business community, in most cases to the detriment of the IT community.

The major concern that I have with regard to these tools is governance of data, or to put it another way, “One Version of the Truth”.  The need for Business Intelligence Right Now (BIRN) to support critical business decisions, based on insight into multiple data sources, is well understood by the Business; however, once the data has left the rarefied air of the BI semantic layer they cannot verify it, and so cannot guarantee it is the one version of the truth.  This is also a major problem with most, if not all, of the current tools on the market: they too cannot guarantee governed data, as most either store the data locally or work only from a snapshot.

This problem has been identified by Quest Software, who currently have in development a number of tools that will aid collaboration not just between IT and the business but also between business departments, with “One Version of the Truth” at the core of their systems.  Self-Service BI is not a myth, but too many supposed “self-service” solutions are being designed (knowingly or not) for the more technical user, who sometimes self-identifies as a “Data Analyst”, and have far too much complexity for the normal Business Analyst (BA).  The BA has no problem with a pivot table in Excel or with understanding the business relationships in the data, but struggles when faced with a requirement to understand SQL or data architecture.  Taking this into consideration, Quest has designed two separate tools, one for the technical user and one for the business user.  Both are easy to use and intuitive, and both can integrate and analyse data from many disparate sources and then provide visualization and basic dashboard reporting; their critical difference, however, is that they supply a secure connection to the company's BI infrastructure to provide the governed data required to make the insight they deliver correct.


Filed under Business Intelligence, QuestBI

Data Mashup in BI

Data mashup within Business Intelligence (BI) applications is one of the latest must-have requirements that I have been asked to build into solutions created for some major enterprise clients recently.  As mentioned in my last blog on the growth of NoSQL use within the BI area, I believe the mashup can be another useful tool in the BI solution, but it must not be used to the detriment of the core principles of any BI solution.  The first mashups used mapping or photo services and combined these with relational or Excel data to create a visualization of the data.  In the beginning most mashups were consumer-based, but recently the mashup has started to interest the wider enterprise.  Business mashups can combine existing internal data with external services to create new views on the data.

The problem with the clamour for the latest technology to be added to the BI application stack is, as always, the differing understanding between the business user and the developer of how it can be used.  The battle from the developer's point of view is getting people within businesses to understand what, how and where the mashup can be utilised within their organisations.  The problem from the business point of view is emphasising the speed at which this capability must be made available to the business.  Mashups can enable non-technical users to build dynamic views of disparate data that are personalized, context-rich, role-tailored and ad hoc, and to explore this data in greater depth.  However, the problem with most of the currently available BI vendors' mashup applications or plug-ins is that they simply offer a numerical analysis of data via the normal OLAP cube route, and then attach a search bar alongside this analysis to search separate silos of textual, web or unstructured content to match against the data already recovered and analysed.

The ability of a mashup to pull content from other sources is what most business users are excited about, combined with the ability to store unstructured data in a NoSQL environment that allows rapid search, retrieval and storage of any and all linked data.  Most corporations are now requesting that BI systems be able to interrogate social networking sites to find out what is being said about their products; this is a perfect example of mashups providing information that most marketing managers and sales teams desperately need in order to improve business productivity and sales success.  This requirement to link to all types of data also needs to be paired with the ability to interrogate all systems available within the corporate environment: there is no point in a BI application that can mash up data if it cannot attach to all the client's information.  The results should also be shown not only in their normal context but in a context that is easy for the customer to understand and use.

I see mashups extending the current traditional data-driven BI solutions to incorporate planned data from a normal RDBMS or OLAP cube, adding in unstructured data, and accessing further information from either RSS or the web using web services.  Most modern BI solutions can handle the first two connections, but connecting to the web can require either a hand-coded web service or the purchase of one of the specific connection applications currently available.  As reported last week in the Briefing Room with Mark Madsen (http://bit.ly/l0C1Cy), Quest Software is about to bring to market a group of tools aimed at both the Data Analyst and the Business Analyst/User which will make the full range of mashup capability available on the desktop for both IT and the Business.  This can only help to improve the harmony between these two areas of the business, which will in turn allow them to deliver dramatically better business results than with traditional business intelligence (BI) systems.


Filed under Data Mashup, QuestBI

NoSQL and its Role in the BI Arena

Business intelligence applications are moving from the traditional connection to an OLAP data source based on relational database systems to the ability to link to and consume data from a variety of disparate sources, including social networks.  The need for a modern BI application to use mashups of data, and to stay agile when integrating multiple types of data source, has led to NoSQL being promoted by many as the next big thing within BI.  Does this mean we have seen the end of the SQL-style RDBMS within the BI area?  There are many pros and cons for both systems, but I believe there is still a place for both within the BI arena.

NoSQL implementations like Cassandra and Dynamo can scale out past the terabyte and on to the petabyte scale by using horizontal scaling across multiple nodes, and the cost differences between SQL and NoSQL implementations in particular are significant.  However, each type of NoSQL system uses its own proprietary code for its connections, and the system is usually set up for a particular model, which enables super-fast performance but hinders the ability to run ad hoc queries on the data.

Companies are now looking to connect to social networking data to enable them to trend sales and customer selections.  This data is very unstructured, and most of it lives in NoSQL stores (Twitter, Facebook etc.).  The problem, as I see it, for most major business clients is whom within their organizations to use to implement a NoSQL BI solution.  Most of the requirements of a BI system – large data sets, speedy recovery of data, and display of results to all business users – can be implemented using a NoSQL data set; however, the technology does require a different type of technical resource.  One possible solution to this problem could be the Toad for Cloud database application from Quest Software, which I am just starting to look at in more detail; it shows great promise in its ability to interrogate cloud-style NoSQL databases like Cassandra, HBase and Azure with SQL terminology, and to allow transfers of data between NoSQL and SQL databases.

The trade-off for NoSQL databases is their lack of ACID guarantees and their limited ability to support ad hoc querying.  Utilizing a SQL RDBMS allows us to use standard connections between servers and clients, especially for those stalwarts of BI reporting, Crystal Reports and BusinessObjects.  It also allows for clean, easy connections when utilizing the most popular object frameworks such as .NET, or when exchanging XML.  A normal IT department usually has at least one SQL data access language expert in its ranks, which allows it at least to understand a BI implementation based on a SQL RDBMS.

In respect of BI, my experience has led me to believe that for the majority of EPOS-based customers, a SQL RDBMS application, possibly with a star-schema data warehouse, will suffice, providing both transactional integrity and the ability to scale as required.  There will of course be exceptions to this model, including requirements to scale out past the petabyte mark or for super-fast results, and it is at that point that I believe the NoSQL solutions can and should be investigated.  I believe that SQL and NoSQL applications will be implemented side by side in many organizations in the future, especially as the drive to include social networking data in our results is realized.  Many BI specialists, myself included, already use a plethora of specialized tools to deliver results to the customer; I cannot see any reason for not adding NoSQL to the toolbox.


Filed under NoSQL, QuestBI