Welcome to Nural's newsletter where you will find a compilation of articles, news and cool companies, all focusing on how AI is being used to tackle global grand challenges.
Our aim is to make sure that you are always up to date with the most important developments in this fast-moving field.
Packed inside we have:
- Microsoft automates the debugging of AI-generated code;
- Addressing gender bias in Wikipedia biographies;
- plus, latest updates from Climate Change AI;
If you would like to support our continued work from £1 then click here!
Graham Lane & Marcel Hedman
Key Recent Developments
Microsoft Jigsaw fixes bugs in machine-written software
What: Large language models such as GPT-3, Codex, and Google’s language model can generate computer code from natural-language specifications. The results have been encouraging, but the quality of the output cannot be guaranteed and mistakes are known to occur. The new Jigsaw tool automates some of the vetting that would otherwise be carried out by human programmers: checking that the code compiles, investigating error messages, and feeding the program sample input to check for the expected output. Jigsaw takes as input an English description of the intended code together with an input/output (I/O) example. If errors occur, a post-processing step applies corrective transformations that address the most common failure modes. The output is then reviewed by a human, whose feedback is used to improve the pre- and post-processing functions.
Key Takeaways: A great deal of attention is being focused on AI coding, not least because of the potential efficiency and financial gains. Salesforce have launched a new AI code generator, and several startups in the recent Y Combinator cohort are working in this area.
Paper: Jigsaw: large language models meet program synthesis
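The compile-and-check loop described above can be sketched in a few lines. This is a minimal illustration of validating model-generated code against a single I/O example, not Jigsaw's actual implementation; the candidate snippet and function name are hypothetical model output invented for the example.

```python
def passes_io_check(candidate_src, func_name, example_input, expected_output):
    """Accept candidate code only if it compiles, runs on the sample
    input without error, and produces the expected output."""
    namespace = {}
    try:
        # Does the generated code compile and execute?
        exec(compile(candidate_src, "<candidate>", "exec"), namespace)
        # Feed the program the sample input.
        result = namespace[func_name](example_input)
    except Exception:
        return False  # compile error or runtime error
    # Check for the expected output.
    return result == expected_output

# Hypothetical model output for "double every element of a list":
candidate = "def double(xs):\n    return [x * 2 for x in xs]\n"
print(passes_io_check(candidate, "double", [1, 2], [2, 4]))  # prints: True
```

In Jigsaw proper, a failure at this stage triggers post-processing transformations and, ultimately, human review; here a failure simply returns False.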
Meta AI’s open-source system attempts to correct gender bias in Wikipedia biographies
What: Just 20% of Wikipedia biographies are about women. While there are a number of initiatives to address bias in existing content, this has not extended to “systemic challenges around the initial gathering of content and the factors that introduce bias in the first place”. A researcher has taken a step towards addressing this by developing an AI model that automatically generates first-draft Wikipedia biographies of women. She hopes that this will inspire editors to create more of the missing biographies. The model and associated data have been made open source by Facebook.
Key Takeaways: Firstly, the initiative highlights a problem that needs to be addressed and identifies scope for further work. Secondly, the work highlights a large gap between the amount of web material available in a few major languages and that available in all other global languages. This can give rise to a lack of the secondary sources needed to factually verify biographical entries.
Blog: Their stories should be celebrated: Using AI to deliver more inclusive biographical content on Wikipedia
Climate Change AI News
- CCAI webinars:
- April 7th (4-5 pm UTC) “Machine learning for oceans” will provide insights into machine learning for oceans and coral reef conservation. Register via Eventbrite
- May 6th “Meta-Learning: Climate impacts and adaptation literature with ML” will focus on making sense of the ballooning climate impacts and adaptation literature with machine learning. Register via Eventbrite
- Cornell University’s Center for Advanced Computing (CAC) is looking for climate-related projects that would require GPU resources to scale their impact. The submission deadline is April 30, 2022.
- AI Innovations 2022 will have a session on AI and Environment. This is a hybrid event with 2 venues for onsite participation: HTW Berlin & University of Exeter, and will take place on May 5-6, 2022.
🚀 How to help people understand AI
The author of a children’s book about AI provides a lucid explanation that is also suitable for adults. She concludes “What we should fear is not AI, but the misuse of AI by evil, power-hungry, or ignorant humans”.
🚀 Building trust with responsible AI
The article provides an accessible introduction about how to build trustworthy AI.
🚀 Problematic machine behaviour: A systematic literature review of algorithm audits
This research summary provides an overview of different approaches to the audit of algorithms, and the common issues that are identified.
Other interesting reads
🚀 The miserable lives of cyborg truck drivers
AI and automation may not "take" jobs, but they can make them worse than before.
🚀 Elon Musk says people might download their personalities onto the TeslaBot
Tesla Bot, now called Optimus, was announced in August 2021. Elon Musk has forecast "something pretty good at the prototype level" by the end of this year and that there might be "at least a moderate volume production" around the end of 2023. He said he thinks it's possible that people may one day be able to download their brain capacities into an Optimus.
🚀 Mercedes Drive Pilot beats Tesla Autopilot by taking legal responsibility
Mercedes will accept full legal responsibility for a vehicle whenever its Drive Pilot system is active. The system is already approved in Germany on certain (limited) types of road, and the company hopes to offer it in the U.S.A. by the end of 2022. Drive Pilot offers full Level 3 autonomy, meaning that drivers can take their hands off the wheel and their eyes off the road. If successful, Mercedes will be the first company to offer full Level 3 in the U.S.A.
🚀 AI beats eight world champions at bridge
Victory marks milestone for AI as bridge requires more human skills than other strategy games.
Cool companies found this week
Neptune.ai - provides a platform for ML experiment-tracking and a registry for models. This enables production teams to log, store, query, display, organize, and compare all model metadata in a single place. This interview with the CEO describes the types of problems the system is seeking to address.
AI Life - is an - apparently unreconstructed - all-male team of 11 who have set about creating a robot called Erica who should be “the most beautiful woman in the world” taking “Miss Universe pageant finalists” as their inspiration. The company claims the distinction of being “the first to cast and train an individual AI entity in a Motion Picture”. The aim is to create “entirely artificial and virtual human-like performances without the need for human interaction” so that “talents wouldn’t need to show up to appear in a movie”. This may not be “cool” but it is certainly egregious ...
Beamup.ai - makes digital twins for enterprise building management, promising design and operation of building systems on a single AI platform. The platform reduces system and infrastructure design time in areas such as security, IT, IoT and electrical. The company has attracted $15 million in seed funding.
And Finally ...
AI/ML must knows
Foundation Models - any model trained on broad data at scale that can be fine-tuned to a wide range of downstream tasks. Examples include BERT and GPT-3. (See also Transfer Learning)
Few-shot learning - Supervised learning that masters a task using only a handful of labelled examples.
Transfer Learning - Reusing parts or all of a model designed for one task on a new task with the aim of reducing training time and improving performance.
Generative adversarial network - Generative models that create new data instances that resemble your training data. They can be used to generate fake images.
Deep Learning - A form of machine learning based on artificial neural networks with many layers.
Nural Research Founder
If this has been interesting, share it with a friend who will find it equally valuable. If you are not already a subscriber, then subscribe here.
If you are enjoying this content and would like to support the work financially then you can amend your plan here from £1/month!