(The Wall Street Journal) YouTube has instituted many changes over the past year to limit the problematic videos it recommends to viewers. A new study suggests the repairs have a way to go. Software nonprofit Mozilla Foundation found that YouTube’s powerful recommendation engine continues to direct viewers to videos that participants said contained false claims and sexualized content, with the platform’s algorithms suggesting 71% of the videos that participants found objectionable. The study highlights the continuing challenge Alphabet Inc. subsidiary YouTube faces as it tries to police the user-generated content that turned it into the world’s leading video service. It...