
Model Context Protocol


About this content

This week we talk about the Marshall Plan, standardization, and USB. We also discuss artificial intelligence, Anthropic, and protocols.

Recommended Book: Fuzz by Mary Roach

Transcript

In the wake of WWII, the US government implemented the European Recovery Program, more commonly known as the Marshall Plan, to help Western Europe recover from a conflict that had devastated the afflicted countries' populations, infrastructure, and economies.

It kicked off in April of 1948, and though it was replaced by a successor program, the Mutual Security Act, just three years later in 1951—which was similar to the Marshall Plan, but had a more militant, anti-communist bent, the idea being to keep the Soviets from expanding their influence across the continent and around the world—the general goal of both programs was similar. The US was in pretty good shape, post-war; in fact, by waiting as long as it did to enter the conflict, and by becoming the arsenal of the Allied side, its economy was flourishing, and its manufacturing base was all revved up and needed something to do with all the extra output capacity it had available, all the resources committed to producing hardware and food and so on. By sharing these resources with allies, by basically just giving a bunch of money and assets and infrastructural necessities to these European governments, the US could get everybody on side, bulwarked against the Soviet Union's counterinfluence, at a moment in which these governments were otherwise prone to that influence; they were suffering and weaker than usual, and thus, if the Soviets came in with the right offer, or with enough guns, they could conceivably grab a lot of support and even territory.

So it was considered to be in everyone's best interest, among those who wanted to keep the Soviet Union from expanding, at least, to get Europe back on its feet, posthaste.

This program and its successor were highly influential during this period, and the Marshall Plan is generally considered one of the better things the US government has done for the world: while there were clear anti-Soviet incentives at play, it was also a relatively hands-off, large-scale giveaway that compared favorably with the Soviets' more demanding and less generous version of the same.

One interesting side effect of the Marshall Plan is that because US manufacturers were sending so much stuff to these foreign ports, their machines and screws and lumber used to rebuild entire cities across Europe, the types of machines and screws and lumber they shipped, which were the standard models of each in the US but many of which were foreign to Europe at the time, became the de facto standard in some of these European cities as well.

Such standards aren't always the best of all possible options; sometimes they stick around long past their period of ideal utility, and they don't always stick at all. But the standards and protocols within an industry or technology do tend to shape that industry or technology's trajectory for decades into the future, as has been the case with many Marshall Plan-era US standards that rapidly spread around the world as a result of these giveaways.

And standards and protocols are what I'd like to talk about today. In particular, a new protocol that seems primed to shape the path today's AI tools are taking.

—

Today's artificial intelligence, or AI, an ill-defined type of software that generally refers to applications capable of doing vaguely human-like things, like producing text and images, but also somewhat superhuman things, like working with large datasets and bringing meaning to them, is developing rapidly, becoming more potent and capable seemingly every day.

This period of AI development has been in the works for decades. The technologies required to make the current batch of generative AI tools—the type that makes stuff based on libraries of training data, deriving patterns from that data and then coming up with new stuff based on the prompting of human users—were originally developed in the 1970s. But the transformer, a fresh approach to what are called deep learning architectures, was first proposed in 2017 by researchers at Google, and that led to the development of the generative pre-trained transformer, or GPT, in 2018.

The average non-tech-world person probably started to hear about this generation of AI tools a few years later, maybe when the first transformer-based voice and image tools started popping up around the internet, mostly as novelties, or even more likely in late 2022, when OpenAI released the first version of ChatGPT, a generative AI system attached to a chatbot interface, which made these sorts of tools more accessible to the average person.

Since then, there's been a wave of investment and interest in AI tools, and we've reached a point where the seemingly obvious next step is removing humans from the loop in more AI-related processes.

What that means in ...
