• Training AI to read animal facial expressions, NIH funding takes a big hit, and why we shouldn’t put cameras in robot pants

  • 2025/02/13
  • Duration: 40 min
  • Podcast


  • Summary

  • First up this week, International News Editor David Malakoff joins the podcast to discuss the big change in NIH’s funding policy for overhead or indirect costs, the outrage from the biomedical community over the cuts, and the lawsuits filed in response.
  • Next, what can machines understand about pets and livestock that humans can’t? Christa Lesté-Lasserre, a freelance science journalist based in Paris, joins host Sarah Crespi to discuss training artificial intelligence on animal facial expressions. Today, this approach can be used to find farm animals in distress; one day it may help veterinarians and pet owners better connect with their animal friends.
  • Finally, Keya Ghonasgi, a postdoctoral fellow at the Georgia Institute of Technology, talks about a recent Science Robotics paper on the case against machine vision for the control of wearable robotics. It turns out the costs of adding video cameras to exoskeletons—such as loss of privacy—may outweigh the benefits of having robotic helpers on our arms and legs.
  • This week’s episode was produced with help from Podigy.
  • Authors: Sarah Crespi; Christa Lesté-Lasserre; David Malakoff
  • Learn more about your ad choices. Visit megaphone.fm/adchoices

