Quality Assurance and its Impact in Data Aggregation
by Folakunle Olajide.
Data aggregation is an important aspect of information technology.
According to import.io, data aggregation is the process of gathering data and presenting it in a summarized format. The data may be gathered from multiple sources with the intent of combining them into a single summary for analysis.
Over the years, a great deal of money and time has been lost in Africa in trying to track crime and other forms of cyber fraud. Data aggregation seeks to harness multiple data sources to deliver the required information, which is useful in many ways, such as analytics, cybersecurity, and tracking perpetrators of crimes such as kidnapping, armed robbery, and identity theft.
I watched a documentary some time ago about a man named Nicholas Winton, who rescued hundreds of children who would otherwise have perished in the Holocaust, in 1939 in what was then Czechoslovakia. He found new families for the children in the UK who raised them. He did not tell his wife of this feat until about 50 years later, when she found a scrapbook that contained the data of all 669 children.
His wife shared the story with some people, and it was eventually featured on a BBC television programme. Some of the children he rescued turned out to be very successful, and they were all present at the programme, which was planned as a surprise in honour of Sir Nicholas Winton. All this was possible as a result of good data documentation.
Note that the process of data aggregation cannot be achieved without putting adequate structures and processes in place. Over the years, quality assurance has been known to impact this aspect of information technology positively in the following ways:
According to Fsnnetwork.org, good data quality rests on five attributes: validity, reliability, precision, integrity, and timeliness.
Youverify, a company at the forefront of harnessing these standards to change the process of data aggregation across Africa, has built products such as BVN services with facial match and NIN services with facial match.
These products match the face of a candidate against their profile, which seeks to validate the candidate's facial data credentials. Quality assurance ensures that the results are valid by carrying out multiple rounds of testing.
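A validity check of this kind can be sketched as follows. This is a minimal illustration only: the response fields (`match`, `confidence`) and the threshold are assumptions for the example, not Youverify's actual API schema.

```python
# Illustrative only: the payload fields "match" and "confidence" are
# hypothetical, not the real facial-match API's response schema.

def is_valid_match(response: dict, threshold: float = 0.85) -> bool:
    """Validate a hypothetical facial-match response before accepting it."""
    required = {"match", "confidence"}
    if not required.issubset(response):
        return False  # reject malformed payloads outright
    confidence = response["confidence"]
    # A result only counts as valid when the match flag and the
    # confidence score agree with each other.
    return bool(response["match"]) and 0.0 <= confidence <= 1.0 and confidence >= threshold

print(is_valid_match({"match": True, "confidence": 0.92}))  # True
print(is_valid_match({"match": True, "confidence": 0.40}))  # False
```

In a QA suite, checks like this would run against many captured responses so that schema drift or low-confidence matches are caught before they reach candidates.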
Reliability is taken into consideration by the fact that our quality assurance structures keep our application programming interface at 99.9% uptime, so it does not experience any significant downtime. Quality assurance also performs regression testing on our products from time to time.
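A regression suite of the kind described above might look like the sketch below, using Python's `unittest`. The `LookupClient` class is a hypothetical stub standing in for a real verification client; a production suite would exercise the live API instead.

```python
import unittest

# "LookupClient" is a hypothetical stub, not Youverify's actual client;
# its lookup behaviour is simulated for illustration.
class LookupClient:
    def lookup(self, id_number: str) -> dict:
        # Stubbed behaviour: a real client would call the live service.
        if not id_number.isdigit():
            return {"status": "error", "reason": "invalid id"}
        return {"status": "found", "id": id_number}

class RegressionTests(unittest.TestCase):
    """Re-run on every release so previously working behaviour never breaks."""

    def setUp(self):
        self.client = LookupClient()

    def test_valid_id_is_found(self):
        self.assertEqual(self.client.lookup("12345678901")["status"], "found")

    def test_malformed_id_is_rejected(self):
        self.assertEqual(self.client.lookup("12-34")["status"], "error")

if __name__ == "__main__":
    unittest.main(argv=["regression"], exit=False)
```

Running this suite automatically on each deployment is one common way to protect an uptime target: a failing regression test blocks the release before users see the breakage.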
Precision cannot be downplayed. It is evident in our agent training platform (portal), which trains our verification agents to carry out their jobs with accuracy and precision by watching training videos; agents must pass with a threshold score of 100% before they can advance to become agents.
The integrity of the data is one of the most important qualities of data aggregation: the sources should be tested and trusted. Our agent mobile app addresses integrity through a feature known as "spoofing/mock control", which shuts the app down when it detects the presence of mock location providers on the agent's device. With this feature, we can be confident that our verification agents are doing their jobs properly by reaching every candidate's address.
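The decision logic behind a spoofing control like this can be sketched in a few lines. The field names (`is_mock`, `accuracy_m`) are assumptions for the example, not the agent app's actual telemetry schema; on Android, for instance, the OS exposes a mock-provider flag on each location fix.

```python
# Illustrative decision logic only; the reading's field names ("is_mock",
# "accuracy_m") are hypothetical, not the real agent-app schema.

def should_lock_app(location_reading: dict) -> bool:
    """Decide whether a location reading looks spoofed and the app should shut down."""
    # The mock-provider flag reported by the OS is the primary signal.
    if location_reading.get("is_mock", False):
        return True
    # Implausibly perfect accuracy is a secondary heuristic.
    return location_reading.get("accuracy_m", 100.0) <= 0.0

print(should_lock_app({"is_mock": True, "accuracy_m": 12.0}))   # True
print(should_lock_app({"is_mock": False, "accuracy_m": 12.0}))  # False
```

Failing closed in this way (lock on any suspicious signal) trades some false positives for stronger integrity guarantees, which fits a verification workflow where spoofed visits are costly.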
In addition to this feature is geo-tagging, which sends updates of our agents' locations from time to time. Furthermore, our staff at Youverify undergo ISO 27001 and ISO 27018 training regularly to ensure we remain consistent with these standards.
Timeliness comes last but is as important as any other attribute of data aggregation. Our application programming interface delivers quickly, with an average response time of 500 ms or less. Our field agents can access a feature called "Task on the go" on our updated app, which enables them to receive newly assigned tasks while in the field. This feature ensures that new tasks are assigned to the agents closest to the task locations.
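Verifying a latency target like 500 ms is straightforward to automate. The sketch below measures average response time over repeated calls; `handle_request` is a stand-in stub (it only sleeps briefly), not a real API call.

```python
import time

# "handle_request" is a hypothetical stand-in for a real API call;
# a real performance test would time actual HTTP round trips.

def handle_request() -> None:
    time.sleep(0.005)  # simulate a fast handler

def average_latency_ms(calls: int = 20) -> float:
    """Return the mean wall-clock latency of handle_request in milliseconds."""
    total = 0.0
    for _ in range(calls):
        start = time.perf_counter()
        handle_request()
        total += (time.perf_counter() - start) * 1000.0
    return total / calls

avg = average_latency_ms()
print(f"average latency: {avg:.1f} ms")
assert avg <= 500.0, "latency budget exceeded"
```

Wiring an assertion like the final one into a scheduled performance test turns the 500 ms figure from a claim into a continuously checked budget.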
In my role as a quality assurance engineer, I have assisted in writing clear and concise test cases to help document bugs and bug fixes. I have also suggested features that can improve the overall quality of our products, automated regression testing, and carried out API testing and performance testing. All of this is done in sync with product managers to ensure synergy.
All these attributes of a good data aggregation system, and more, would not have been possible without the vision of our own Thomas Edison at Youverify, a man who crosses his T's and dots his I's: Dr Gbenga Odegbami. He has made the quality assurance process seamless by putting supportive structures in place, and this is given credence by the fact that we are ISO certified.
Finally, I must give credit to Famous Ehichioya, our Chief Technical Officer, who has been supportive of the quality assurance process from the outset.
In sum, Youverify exemplifies best practices for quality assurance in data aggregation, in addition to its reputation as a dream place to work. This is evident in the influx of top talent joining the team. We will not stop until we change the African data story.
Import.io, Data Aggregation, 2019.
Fsnnetwork.org, Data Quality Assurance, 2nd Edition, 2009.