White hat hacker AI bots prepare for DARPA’s DEF CON cyber brawl • The Register

The research wing of the US military has picked the seven teams that will compete to build machine-learning software that can automatically find and patch bugs to fend off hackers.

- Software must automatically find bugs in rival code.
- In qualifying heats last year, AI rivals examined 131 pieces of software to find 590 software flaws that DARPA knew about.
- No team came even close to finding and fixing them all, but by combining the best results from each team, the test code was 100 per cent patched by the end of the competition.
- The agency has put up $2m in prize money in the unlikely event that a team builds a system that can not only find flaws but also write its own patches and deploy them without crashing.
- Once the competition is over, all the teams’ code – and DARPA’s test code – will be put online in perpetuity under an open-source license.

To read the full article, click here.

@cybersecboardrm: “White hat hacker #AI bots prepare for DARPA’s DEF CON cyber brawl #cybersecurity #infosec”