• Natural Language Generation

  • Author: Keelin M
  • Podcast

  • Summary

  • The basics of natural language generation (NLG), based on the curriculum of CIS 5300 – and created with a little help from NotebookLM!

    All episode cover images created with Flux.1-schnell



Episodes
  • Vector Space Models
    2025/02/16

    This week, we will continue our exploration of vector space semantics and embeddings. We'll begin the module by wrapping up word embeddings and discussing bias in vector space models. Then, we'll discuss a variety of goals that any representation of word meaning should aim to achieve. These six goals will help us understand different aspects of word meaning and the relationships of words with other words. Then, we'll pivot to a coding demo that will give you hands-on experience working with vector space models, showing how word embeddings can be used to retrieve words with similar meanings and to solve word analogy tasks.

    17 min
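    The episode's actual coding demo isn't reproduced here, but the analogy trick it describes can be sketched with toy vectors. The 3-dimensional embeddings below are made up purely for illustration (real embeddings such as GloVe or word2vec have 50–300 dimensions); the analogy "a : b :: c : ?" is solved by nearest-neighbor search around b − a + c under cosine similarity.

    ```python
    import numpy as np

    # Hypothetical toy embeddings, hand-crafted for illustration only.
    vecs = {
        "king":  np.array([0.8, 0.3, 0.1]),
        "queen": np.array([0.8, 0.3, 0.9]),
        "man":   np.array([0.6, 0.1, 0.1]),
        "woman": np.array([0.6, 0.1, 0.9]),
    }

    def cosine(a, b):
        """Cosine similarity between two vectors."""
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    def analogy(a, b, c):
        """Solve a : b :: c : ? via vector arithmetic (b - a + c)."""
        target = vecs[b] - vecs[a] + vecs[c]
        # Exclude the three query words, return the nearest remaining word.
        candidates = [w for w in vecs if w not in (a, b, c)]
        return max(candidates, key=lambda w: cosine(target, vecs[w]))

    print(analogy("man", "woman", "king"))  # → queen
    ```

    With real pre-trained embeddings the same arithmetic recovers many lexical relations (gender, capital-of, verb tense), which is exactly the kind of retrieval the demo explores.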
  • Neural Language Models
    2025/02/21

    In this module, we'll take a look at neural-network-based language models, which, unlike the N-gram-based language models we looked at earlier, use word-embedding-based representations of their contexts. This allows them to make much better probabilistic predictions about the next word in a sequence, and they have become the foundation for large pre-trained language models like ChatGPT that have led to exciting innovations in the field of NLP.

    19 min
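    As a minimal sketch of the idea (not the course's implementation), a fixed-window feed-forward language model embeds the previous n words, concatenates those embeddings, and maps them through a hidden layer to a softmax over the vocabulary. The tiny vocabulary and randomly initialized weights below are assumptions for illustration; in practice the parameters are learned by gradient descent on text.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    vocab = ["<s>", "the", "cat", "sat", "down"]
    V, d, h = len(vocab), 4, 8   # vocab size, embedding dim, hidden dim
    n = 2                        # predict the next word from 2 context words

    # Randomly initialized parameters (learned in a real model).
    E  = rng.normal(size=(V, d))      # word embedding matrix
    W1 = rng.normal(size=(n * d, h))  # concatenated context -> hidden
    W2 = rng.normal(size=(h, V))      # hidden -> vocabulary logits

    def next_word_probs(context):
        """Forward pass of a fixed-window neural LM."""
        ids = [vocab.index(w) for w in context]
        x = E[ids].reshape(-1)        # concatenate the context embeddings
        hidden = np.tanh(x @ W1)
        logits = hidden @ W2
        exp = np.exp(logits - logits.max())
        return exp / exp.sum()        # softmax distribution over vocab

    p = next_word_probs(["the", "cat"])
    print({w: round(float(p[i]), 3) for i, w in enumerate(vocab)})
    ```

    Because contexts are represented by embeddings rather than exact N-gram matches, similar contexts share statistical strength, which is the advantage over count-based N-gram models that the episode highlights.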
  • Vector Space Semantics
    2025/02/03

    In this module, we'll begin to explore vector space semantics in natural language processing. (This will continue into next week.) Vector space semantics are powerful because they allow us to represent words in a way that lets us measure similarity between them and capture several other kinds of meaning. We'll start this module by exploring important concepts that underpin this topic, like the distributional hypothesis and term-by-document matrices, and then switch to cover a recent approach to vector space models called word embeddings.

    21 min
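    The term-by-document matrix mentioned above can be sketched in a few lines. The three toy "documents" below are invented for illustration; each word becomes a row vector of its per-document counts, and, per the distributional hypothesis, words that occur in similar documents end up with similar vectors.

    ```python
    import numpy as np

    # Toy corpus, made up for illustration.
    docs = [
        "the cat sat on the mat",
        "the dog sat on the log",
        "stocks fell as markets tumbled",
    ]
    vocab = sorted({w for d in docs for w in d.split()})

    # Term-by-document matrix: rows are words, columns are documents,
    # entries are raw counts of the word in that document.
    M = np.array([[d.split().count(w) for d in docs] for w in vocab])

    def cosine(a, b):
        """Cosine similarity between two count vectors."""
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    i = {w: k for k, w in enumerate(vocab)}
    print(cosine(M[i["cat"]], M[i["mat"]]))     # same document → 1.0
    print(cosine(M[i["cat"]], M[i["stocks"]]))  # disjoint documents → 0.0
    ```

    Word embeddings, covered next, replace these sparse count rows with dense low-dimensional vectors learned from much larger contexts, but the similarity computation stays the same.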

Listener reviews for Natural Language Generation
