Management Magazine

Researchers need new approach to data handling


The new model allows wide-ranging matters to be researched concurrently in one sweep, thereby saving resources.

Research practice has gradually shifted from a linear process to a predominantly modular one, increasing the availability of quality data in almost all areas, including finance. Unlike in the past, when a researcher, working individually or as a member of a team, would seek to accomplish the entire research project alone, from data collection to analysis and reporting, modern practice encourages division of labour. The specialists who collect data therefore need not be the ones who process and store it.

Distinct roles in data handling

It is from this model that distinct specialisations have emerged. They include field researchers, data processors, statisticians and data scientists. Field researchers are the data collectors, interested in capturing the raw data as accurately and suitably as possible. The data processors expertly enter the data in the most suitable formats and index it for storage and handling (currently referred to as data warehousing).

With the increased use of computing technology, the statistician's role is shifting considerably from that of the 'mathematician' who would retreat to the backroom with the data and emerge days later with calculated answers. Computing has eased this process, and statisticians now contribute by advising on how computer programs can produce better statistical output, as well as by devising new formulas.

Ultimately, the data scientists sit at the end of the process. They retrieve the data through data mining and are concerned with making it communicate, especially visually, to audiences. They strive to ensure that the output is recognisable, appealing and convincing.

Unfortunately, many stakeholders in both the private and public sectors are yet to learn of this shift and remain stuck in traditional processes that waste resources through unnecessary duplication. The new approach could be highly productive if pioneered by the Kenya National Bureau of Statistics, as this would encourage many public-sector departments to build their own sector-specific datasets. The bureau could then concentrate on publishing summaries while relieving itself of the cumbersome process of gathering and processing all the data. Embracing this change would enable the country to accumulate quality data that can be used by diverse interest groups.

The gains of embracing the new model

One major advantage of this approach is that research becomes more impersonal. It is thus protected from individual researchers' inherent biases, since different sets of experts take over at each stage. The complete project is a product of collaborative work and is hence more credible and reliable than data from the traditional one-person operation, which is more likely to be distorted. Secondly, it fosters the development of new specialties that concentrate on their core functions. These specialisations further invest in improved systems to make themselves more effective.

Another advantage is that it allows wide-ranging matters to be researched concurrently in one sweep, thereby saving resources. Researchers ask respondents diverse questions in a single interview and avoid returning soon afterwards for separate engagements. Interested stakeholders, for their part, select only the data frames they require. The model also gives stakeholders, including individuals, organisations and scholars, the opportunity to analyse the data themselves, rather than merely reading the results without participating in the process.

The case of Financial Services Deepening (FSD)

A proven example of this modern research practice is found in Financial Services Deepening's research work. Established in Kenya in 2013, the organisation has over the past few years built a reliable dataset from a consistent survey of 8,000 households spread across all the counties, targeting urban and rural dwellers equally.

Of course, FSD specialises in the financial and investment fields; its output thus provides rich data for stakeholders in business, economics and investment. Given that the surveys are conducted regularly, time-series analyses can be performed to identify trends in an area of interest.
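To illustrate the kind of trend analysis a repeated survey enables, the sketch below fits a simple least-squares slope to hypothetical yearly figures. The years and access rates are invented for illustration only; they are not FSD's actual results.

```python
# Illustrative only: hypothetical yearly shares of households with
# formal financial access, in the spirit of a repeated survey.
years = [2016, 2019, 2021]
access_rate = [0.75, 0.83, 0.84]

# A least-squares slope gives the average annual change in the rate.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(access_rate) / n
slope = (
    sum((x - mean_x) * (y - mean_y) for x, y in zip(years, access_rate))
    / sum((x - mean_x) ** 2 for x in years)
)
print(f"average change per year: {slope:.3f}")
```

A positive slope would indicate widening access over the survey rounds; with more rounds, the same calculation extends unchanged.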

The challenge a stakeholder may encounter is in decoding the dataset. Depending on the data-analysis tools at their disposal, the data or datasets may need to be recoded or reshaped before they can be accessed or handled further. Operations such as importing, splitting, merging, cleaning and exporting datasets or charts can get somewhat intricate.
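A minimal sketch of such operations, using the pandas library, is shown below. The tables, column names and values are hypothetical, not FSD's actual schema; the point is only to show what importing, merging, cleaning and exporting look like in practice.

```python
import pandas as pd

# Hypothetical survey extracts: household demographics and financial access.
# Column names and values are illustrative, not any real survey's schema.
demographics = pd.DataFrame({
    "household_id": [101, 102, 103, 104],
    "county": ["nairobi", "kisumu", "mombasa", "nakuru"],
    "setting": ["urban", "urban", "urban", "rural"],
})
access = pd.DataFrame({
    "household_id": [101, 102, 103, 105],
    "has_bank_account": [True, False, True, True],
})

# Merge the two frames on the shared key, keeping only households
# present in both extracts (an inner join).
merged = demographics.merge(access, on="household_id", how="inner")

# A simple cleaning step: standardise the capitalisation of a text column.
merged["county"] = merged["county"].str.title()

# Export the combined frame for downstream analysis.
merged.to_csv("combined_survey.csv", index=False)
print(len(merged), "households matched in both extracts")
```

Real datasets add complications (missing values, differing codebooks between survey rounds), which is where the expert help discussed below comes in.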

Again, given the numerous data-analysis software packages available, it can be confusing to determine which is most appropriate for one's needs. Consulting experts in electronic data processing, computerised statistics and data science may therefore be necessary to obtain the information sought.

It would therefore be foolhardy for an organisation or individual to embark on a study to establish the levels of accessibility of banking, insurance, savings and investment products in Kenya when all they need to do is access the existing datasets from FSD's online resources. Other datasets are provided by international bodies such as the World Bank and the IMF, among other specialised organisations and agencies.

Edwin Musonye is a Technical Communication professional at Document Point. Email: 
