It is increasingly understood that systems which live in an ecosystem yet function in silos cannot work constructively towards the larger objective of a single-window, fraud-resilient system. In this article we present references to e-governance projects that have adopted a data strategy, coupled with the right institutional and technology framework, to deliver better services to their citizens.
Some case studies of implemented analytics projects are presented for reference:
Smart City London
Transport for London (TfL) is the leading agency responsible for providing transport services to Londoners. TfL has adopted a data-driven approach to service delivery, with data collected daily from its users. Key decisions that TfL has taken in this direction:
- Institutional set-up: The Mayor of London spearheaded the smart city project, engaging entrepreneurs, academics and technical experts in a body that sets the direction for the future of transport in London.
- Academic partnerships: TfL forged partnerships with leading academic institutions such as MIT and Oxford to advance novel ideas in the development of the city.
- Opening up data for collaboration: The data that TfL gathers is public data, and the loop is closed by making it openly available for collaboration.
- Investment in data gathering and analytics: TfL has deployed sensors and large systems such as SAP HANA to gather and analyze data.
- In-house IT and intelligence team: Analysis and inference require significant domain expertise. An in-house IT and intelligence team ensures that optimal intelligence is generated and continues to grow.
“The department’s leadership and incessant follow-up are critical to the success of a data integration project. The future of the e-governance landscape cannot be driven without robust data integration.”
Catching fraud and evasion in taxation
1. Standardization of metadata: Fraudsters often misreport data. Checking and verifying data across parties requires standardized metadata across all bodies collecting government data.
2. Multi- and cross-referencing of data through integration with third parties: Integrate with all prominent bodies collecting government data to enable referencing and validation.
3. Centralized, government-run and government-controlled system: The system where analytics is run should be centralized. This enables speedier intelligence and a 360-degree view of the system.
4. Free verification platform: Expose verification services to the outside world; this helps other parties ensure data consistency and provides a mechanism for cross-checking fraudsters.
5. Data gathered and exchanged to give indicators for monitoring and auditing: The analysis done on the data, when red flags are found, should also signal potential fraud to aid monitoring and evaluation activities.
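The checklist above — standardized metadata, cross-referencing with third parties, and red-flag signals — can be sketched in a few lines. This is a minimal illustration, not any agency's real schema: the field names, agencies and the 5% tolerance threshold are all hypothetical assumptions.

```python
# Sketch: normalize records from two bodies onto shared metadata,
# then cross-reference declared turnover against third-party data.
# All field names and thresholds are illustrative assumptions.

def normalize(record, field_map):
    """Map an agency-specific record onto a shared metadata schema."""
    return {std: record[src] for std, src in field_map.items()}

def cross_check(declared, third_party, tolerance=0.05):
    """Flag taxpayer IDs whose declared turnover deviates from
    third-party records by more than the given tolerance."""
    flagged = []
    by_id = {r["taxpayer_id"]: r for r in third_party}
    for rec in declared:
        ref = by_id.get(rec["taxpayer_id"])
        if ref is None:
            flagged.append((rec["taxpayer_id"], "no third-party record"))
        elif abs(rec["turnover"] - ref["turnover"]) > tolerance * ref["turnover"]:
            flagged.append((rec["taxpayer_id"], "turnover mismatch"))
    return flagged

# Hypothetical data from two bodies using different field names,
# brought onto one schema before comparison.
tax_returns = [normalize(r, {"taxpayer_id": "tin", "turnover": "decl_turnover"})
               for r in [{"tin": "A1", "decl_turnover": 90_000},
                         {"tin": "B2", "decl_turnover": 200_000}]]
bank_reports = [normalize(r, {"taxpayer_id": "acct_tin", "turnover": "credits"})
                for r in [{"acct_tin": "A1", "credits": 150_000},
                          {"acct_tin": "B2", "credits": 198_000}]]

print(cross_check(tax_returns, bank_reports))
# A1 deviates by ~40% and is flagged; B2 is within the 5% tolerance.
```

The point of the sketch is the order of operations: standardization first, then cross-referencing, then a red-flag output that downstream monitoring can act on.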
Key factors for making such analytics initiatives succeed:
1. Relentless follow-up: Having the data, tools and techniques is a good start, but without action, consistent follow-up and support from top management, they become a wasted investment. It is imperative to respond to all intelligence coming from the data.
2. Implementing effective tactics: Short-term tactics coupled with on-the-ground human intelligence work well; after all, not everything can be derived and assured by technology.
3. Rapid response: The intelligence emerging from data-mining activities has a shelf life. Actions not taken in time can destroy its utility entirely.
4. Availability of timely and accurate information: The knowledge emanating from the system is useful only when it is accurate and available at the right time.
5. Data visualization techniques: Visuals are a more understandable and appealing way of representing information.
6. Domain awareness: Analytics without input from domain experts is of little value. A system exists in an ecosystem where it works with other systems; therefore a grounding in the domain, alongside the technology, is essential.
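The "rapid response" factor above — intelligence has a shelf life — can be made concrete by decaying each lead's score with age and triaging the freshest, strongest leads first. This is an illustrative sketch only: the exponential decay model, the 30-day half-life and the case data are assumptions, not a prescribed method.

```python
# Sketch: prioritize fraud leads by a time-decayed score, so that
# stale intelligence sinks in the queue. The half-life is an assumed
# parameter, not a recommended value.

def decayed_score(base_score, age_days, half_life_days=30):
    """Halve a lead's score for every half-life that has elapsed."""
    return base_score * 0.5 ** (age_days / half_life_days)

# Hypothetical leads: (case id, base score from analytics, age in days).
leads = [("case-101", 0.9, 60),
         ("case-102", 0.6, 5),
         ("case-103", 0.8, 30)]

ranked = sorted(leads, key=lambda l: decayed_score(l[1], l[2]), reverse=True)
for case_id, score, age in ranked:
    print(case_id, round(decayed_score(score, age), 3))
# case-101 had the highest raw score, but at 60 days old it ranks last.
```

The design point is that a strong but stale signal (case-101) should not outrank a moderate, fresh one (case-102), which mirrors the argument that actions not taken in time ruin the utility of intelligence.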
While formulating the approach, due consideration should also be given to the institutional and technical arrangements.