Amazon Algorithm Recommends Bomb Ingredients to British Customers After Terror Attack

The site's "frequently bought together" section recommended dangerous items to users following the Parsons Green terror attack in London.


The “frequently bought together” recommendations on Amazon (AMZN) normally suggest innocuous items to complement user purchases. But a British investigation has found that the online retailer’s algorithm may actually be encouraging online extremism.

According to a report from U.K. news station Channel 4, some Amazon customers in Britain got recommendations for bomb-making ingredients while shopping on the site. This revelation comes only five days after a crude explosive device, concealed in a bucket inside a plastic bag, exploded at London’s Parsons Green subway station and injured 29 people. It was the fifth major terror attack in England this year.

Amazon customers buying innocuous products like cooking ingredients received “Frequently bought together” prompts for items like ball bearings, which could be combined with the cooking products and used as shrapnel for an explosive device. Other suggested items included thermite, a pyrotechnic combination of metal powder and metal oxide.

Britain has no laws at the moment to prevent this problem. While the government has issued guidance to business owners on buying and selling explosives and restricted chemicals, these rules don’t yet apply to online stores like Amazon.

Members of Parliament want to close this loophole, however.

“Amazon needs to recognize that it has some social responsibility,” Yvette Cooper MP, chair of Parliament’s Home Affairs Committee, told Channel 4. “The idea that a company like this could be making it easier for people to put together dangerous explosives is very shocking.”

A tube train in London following the Parsons Green terror attack. Chris J. Ratcliffe/Getty Images

In a statement, Amazon said it was “reviewing” its website to ensure that all products “are presented in an appropriate manner.”

“We will work closely with the police and law enforcement agencies should circumstances arise where we can assist investigations,” the statement continued.

This isn’t the first time Amazon’s algorithm has been criticized in Britain. The Sun newspaper found that bomb-making manuals detailing “simple yet powerful” explosive formulas were available on Amazon days after the Manchester terror attack. The books were eventually removed from the site.

The Amazon revelations come at a time of increased scrutiny of website algorithms. Google (GOOGL) and Facebook (META) have come under fire for allowing advertisers to direct ads to users who searched for racist sentiments like “Jew hater” or “black people ruin neighborhoods.” Facebook also allowed advertisers to exclude certain races from housing and employment ads. Both companies pledged to restrict how advertisers targeted their audiences in response to these revelations.

Facebook has been criticized further for its role in potentially influencing voters during the 2016 election. Earlier this month, the site revealed that 470 fake accounts based in Russia had purchased more than $100,000 worth of ads focused on divisive issues. Those accounts have since been shut down.
