Data Collection For AI Solves Problems, Helps Researchers, Panellists Tell UN-Led Event
17/05/2018 by Catherine Saez, Intellectual Property Watch

At a time when data collection has become a prickly subject and public distrust of large data-collecting companies such as Facebook, Google or Amazon has risen, a UN-led international summit on artificial intelligence this week sought to present the potential of the new technology for solving global problems. Data is the basic fuel of artificial intelligence, and panellists at the event showed how data collection has led to problem solving. For instance, the Chan Zuckerberg Initiative hopes to give biomedical researchers the ability to tap into the global conversation and browse some 200 years of research.

[Image: Data for good panel at the AI for Good Global Summit]

The AI for Good Global Summit is taking place at the UN International Telecommunication Union (ITU) from 15-17 May. The summit is organised by the ITU in partnership with the XPrize Foundation and a number of other organisations, including many United Nations agencies.

Sam Molyneux, general manager of the Chan Zuckerberg Initiative, co-founded by Facebook creator Mark Zuckerberg and his wife Priscilla Chan in December 2015, described the initiative's efforts in the field of biomedicine. The initiative builds tools for scientific researchers, on the belief that most progress in society, in medicine and toward the UN Sustainable Development Goals comes from science or is rooted in scientific research, he said.

Two major issues underlie scientific information, according to Molyneux.
One is knowledge complexity, he said, explaining that today it is “tremendously difficult” to read enough articles and store the complex knowledge, particularly in biomedicine, in the human brain. The other issue is research awareness, according to Molyneux. With some 4,000 new papers published every day in biomedicine, and the exponential increase in the production of citable entities, the speed of science and of progress means researchers need powerful basic tools to tap into the global conversation and handle the knowledge complexity, he said.

At the initiative, “we believe that tools like AI [artificial intelligence] can be applied at scale in the context of philanthropy to build a product and give it away for free for the global good,” he said, adding that the initiative supports the goal of seeing all diseases curable, preventable and manageable by the end of the century. That is an audacious goal which requires the efforts of all scientists, and requires helping the entire ecosystem move faster, he said.

“We imagine a ‘Spotify-like experience’ for research … where you can stream research, tap into the conversation, discover arbitrary intersections or subfields of research instantly and are able to browse the entire landscape of discovery in the sciences” over the last 200 years, he explained.

Molyneux said the initiative has been using AI and working with scientific publishers over the last eight months to develop a tool aimed at producing an evolving map of scientific knowledge. The tool, developed with the best talent in Silicon Valley “and elsewhere,” involves predictive models that, for example, anticipate the impact of research the moment it is published, he said.
In the context of philanthropy, this is a tool “we built for free, for everyone, it is designed for biomedical researchers at heart but is available for everyone else as well.” In terms of advice for such projects, he suggested that developers “go for the ultimate data set,” and warned that AI and data do not necessarily equal value: the dataset produced might not be of adequate quality, might give false negatives and false positives, and could ultimately have a negative impact.

World Bank: AI Helps Reconstruct Transportation System

According to Edward Hsu, senior advisor in the Office of the President at the World Bank Group, one of the questions is how to harness the benefits of disruptive technology for all, particularly poor and developing countries. Some World Bank projects use big data, machine learning and analytics, he said. One of those projects, in Haiti, concerns the rebuilding of the transportation system after the earthquake, he said.

Haiti is a data-poor environment, and it was hard to find where people live and work because many jobs are informal, he explained. In many poor communities, people have very long commutes and can spend up to 70 percent of their income on transportation, he added. According to Hsu, a thorough analysis of anonymised data was used to track how people get to work, which led to the establishment of a public transport improvement plan using buses. For the city centre, average commuter accessibility was increased by 52 percent in terms of time and cost, he said. This example illustrates that even in a data-poor environment, data can actually be collected and used to refine the discussion, he said.

Lack of Data Available In Developing Countries

Nagla Rizk, professor of economics at the American University in Cairo, and founding director of the Access to Knowledge for Development Center (A2K4D) there, said data is a prerequisite of technology for the global good.
She underlined the persistent issue of the digital divide in developing countries and the lack of available data, which is often scattered, held by large companies or by governments. This lack of available data makes it impossible to provide material for AI technology applications, she said. There is a need for organic data collected on the ground in innovative ways, with a mix of technologies, she said. Rizk also highlighted the challenge of the asymmetry between encouraging economic freedom and data-driven innovation on the one hand, and the temptation to curb civil liberties on the other. She also stressed the need for an enabling environment, capacity building, and an interdisciplinary approach to AI.

Big Data Complementing Traditional Metrics

Silja Baller, practice lead for Digital Economy and Innovation at the World Economic Forum, mentioned challenges and opportunities at the intersection of economic policymaking and new technology. She mentioned the efforts of the World Economic Forum to create a common language around issues, converge conversations when the debate is very polarised, help build partnerships when it is not, and catalyse new research partnerships. She underlined the fact that big data is helping policymakers by complementing traditional metrics. Big data, which has much greater granularity and a reach into the informal sector – important where there are gaps in national statistics – can help draw a more accurate picture of entire economic and social ecosystems, she said.
Image Credits: Catherine Saez

Catherine Saez may be reached at email@example.com.

"Data Collection For AI Solves Problems, Helps Researchers, Panellists Tell UN-Led Event" by Intellectual Property Watch is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.