Giving Compass' Take:

• Fast Company reports on a recent conference where CNN's Van Jones discussed how jobs developing artificial intelligence may help young people of color in low-income communities.

• What can we do to ensure these young people get the education and training needed for AI and other tech-industry fields? The first step is to make sure that STEM programs are robust.

• Here's how one initiative extends the power of AI to high school girls.


Van Jones is posing for selfies with young fans after a conference last week in Mountain View, California, on artificial intelligence and employment opportunities. I’m supposed to interview him, once he can break away. But every time he tries, another earnest face turns to him, and he’s pulled back for one more photo.

Perhaps best known as an analyst and host on CNN, the Oakland-based Jones says his primary job is serving as volunteer president of Dream Corps, a multifaceted organization promoting job opportunities for urban and minority youth, prison population reduction, and political dialogue. “Our job is to close prison doors and open doors of opportunity,” says Jones, always ready with a catchy turn of phrase, as well as a “huh!” or a guffaw for emphasis.

The July 17 conference, the Summit on Artificial Intelligence and Its Impact on Communities, kicked off a collaboration between Dream Corps and AI4ALL — a youth education program founded by artificial intelligence pioneer (and Chinese immigrant) Fei-Fei Li. The head of AI for Google Cloud, Li also runs Stanford University’s AI Lab and Vision Lab. In 2015, she started an AI summer camp for high school girls at Stanford, which grew into AI4ALL. Formalized as a nonprofit organization in 2017, it now operates out of six universities — targeting girls and minority and low-income children.

AI4ALL is a logical partner for Dream Corps, which teaches programming to underrepresented communities through its #YesWeCode program. AI education also matters for criminal justice reform. A 2016 ProPublica study, for instance, found that a sentencing algorithm, built on faulty data and analysis, was wildly unreliable at assessing a defendant’s likelihood of recidivism. Further, it mislabeled black defendants almost twice as often as it mislabeled white defendants.

Read the full article about AI jobs as a route out of poverty by Sean Captain at fastcompany.com.