Does provenance matter when it comes to digital products and services?
Explainable AI focuses on making the results of AI-based products and services transparent and understandable to humans.
For example, if a car insurance premium goes up based on some predictive model of traffic data run by the insurance company, customers may want to understand why. Arguments have been made to give consumers and customers of digital products and services more access to data relevant to (and about) them, and to how that data is being used.
But the scope of this discussion is too narrow: we need explainable applications and services, whether they use AI or "old-fashioned" procedural logic. This is often referred to as provenance, which Wikipedia defines as "the chronology of the ownership, custody or location of a historical object". While originally used to assess artwork, the term is increasingly applied to tracing data and digital artefacts.
Slowly, provenance is making its way into applications, but features like "why am I seeing this ad" from Google or Facebook are underwhelming. The question is: can we do better?
Automating laborious tasks
It is probably safe to assume that pressure from end-users and regulators to add provenance features to software systems will increase. Who wouldn't want to be confident that their personal data are safe and used meaningfully by district health boards (DHBs) tracking medical treatments? And why would banks purchase software if they cannot be certain that it meets the regulatory requirements of the financial sector?
Provenance may even offer valuable business opportunities: a way to get ahead of the competition, and a means to (partially) automate existing laborious tasks such as audits to demonstrate regulatory compliance, or the analysis of software supply chains to detect vulnerabilities. But offering such provenance also poses a risk, as it may require significant engineering effort and cause computational overhead in provenance-enabled software.
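To make the supply-chain use case concrete, here is a minimal sketch of how build-time provenance could be checked against vulnerability advisories. All component names, versions and advisories below are invented for illustration; real tooling would work with standard SBOM formats and advisory feeds.

```python
# A minimal "software bill of materials": component name -> version,
# as might be recorded by provenance tooling during a build.
# (All names and versions are hypothetical.)
sbom = {
    "libfoo": "1.2.0",
    "libbar": "0.9.1",
    "libbaz": "2.0.3",
}

# Known-vulnerable (component, version) pairs, e.g. from an advisory feed.
advisories = {
    ("libbar", "0.9.1"),
    ("libqux", "3.1.0"),
}

def find_vulnerable(sbom, advisories):
    """Return the components in the SBOM that match a known advisory."""
    return sorted(
        name for name, version in sbom.items()
        if (name, version) in advisories
    )

print(find_vulnerable(sbom, advisories))  # ['libbar']
```

Even this toy check shows why provenance matters: without a record of which components went into a build, the question "are we affected by this advisory?" cannot be answered automatically.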
Veracity technology
We, a group of researchers at Victoria University of Wellington and the University of Canterbury, are looking into the provenance of digital products as part of a Science for Technological Innovation National Science Challenge project on veracity technology to ensure integrity of data and products.
We have started to implement a few demonstrators based on real-world scenarios and datasets that show how to integrate end-user-oriented provenance features into current software systems. In a first step, we hardcode provenance into those systems, as this will allow us to evaluate properties like usability and computational overhead. In a second step, we will develop techniques to automatically retrofit existing systems with provenance features by means of static program analysis, instrumentation and refactoring. Finally, we plan to extend this approach to also support contestability.
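To illustrate what "hardcoding provenance" into a system might look like, here is a sketch in which each computed value carries a record of the inputs and rule that produced it. The premium rule, names and structure are invented for illustration, not taken from our demonstrators.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Provenance:
    rule: str        # which business rule produced the value
    inputs: dict     # the inputs the rule consumed
    timestamp: str   # when the value was computed

@dataclass
class TracedValue:
    value: float
    provenance: Provenance

def compute_premium(base: float, risk_factor: float) -> TracedValue:
    """Compute a (hypothetical) premium and attach a provenance record."""
    premium = base * risk_factor
    return TracedValue(
        value=premium,
        provenance=Provenance(
            rule="premium = base * risk_factor",
            inputs={"base": base, "risk_factor": risk_factor},
            timestamp=datetime.now(timezone.utc).isoformat(),
        ),
    )

p = compute_premium(500.0, 1.2)
print(p.value)            # 600.0
print(p.provenance.rule)  # the "why" a customer could be shown
```

Hardcoding this by hand, as in the sketch, is exactly the laborious step that the later retrofitting techniques (static analysis, instrumentation, refactoring) would aim to automate.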
Your participation needed!
At this early stage we also need to better understand the needs and practices of the technology sector regarding provenance. We have therefore created a short anonymous survey (10 minutes) and would very much appreciate your participation:
The ultimate goal of this project is to provide tools and infrastructure to help organisations collect, analyse, share and use provenance-related information during the development, maintenance, distribution and consumption of digital products and services.
For more information about the project, to provide feedback or to talk to us about collaboration, please contact us by email: Jens Dietrich ([email protected]) and Matthias Galster ([email protected]).
Jens Dietrich is an Associate Professor in the School of Engineering and Computer Science at the Victoria University of Wellington. He has a degree in Mathematics and a PhD in Computer Science, and worked in software consultancy in Germany, Switzerland, the UK and Namibia after graduating, before moving to NZ to take up a position at Massey University in 2003. He moved to Victoria University in 2018. His research focuses on the automated detection of bugs and vulnerabilities in large real-world software systems. This work has been funded through several gifts and grants from Oracle Inc and the Science for Technological Innovation National Science Challenge.
Matthias Galster is an Associate Professor in the Department of Computer Science and Software Engineering at the University of Canterbury in Christchurch. He conducts research in the area of software engineering. He is particularly interested in studying and improving the way we develop high-quality software, including productive teams, processes and practices, and empirical software engineering. His focus is on software requirements engineering, software architecture, and software development processes and practices. His research has been funded through grants from the Ministry for Business, Innovation and Employment and the Science for Technological Innovation National Science Challenge.