AI_cert


Terms covered: natural language processing (NLP), natural language understanding (NLU), named entity recognition (NER), deep learning, Einstein Bots, Einstein Agent, Einstein Discovery, Einstein Vision for Field Service, Einstein Language, GANs (generator and discriminator), transformer models, variational autoencoders (VAEs), parsing (syntactic and semantic), segmentation, tokenization, stemming, lemmatization, part of speech tagging, sentiment analysis, intent analysis, context (discourse) analysis, and premortem. Definitions follow below.


Natural language processing (NLP): refers to systems that handle communication between people and machines.

Natural language understanding (NLU): is distinct from NLP and describes a machine's ability to understand what humans mean when they speak as they naturally would to another person.

Named entity recognition (NER): labels sequences of words and picks out the important things like names, dates, and times. NER involves breaking a piece of text apart and tagging the entities it contains.
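A minimal NER sketch, assuming spaCy and its small English model are installed (the sample sentence and entity examples are made up for illustration):

```python
import spacy

# Requires: pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

doc = nlp("Maria Lopez met the Acme Corp team in Chicago on March 3, 2024.")

# Each recognized entity gets a label such as PERSON, ORG, GPE (place), or DATE.
for ent in doc.ents:
    print(ent.text, ent.label_)
```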
Deep learning: refers to artificial neural networks being developed between data points in large databases. Just like our human mind connects data points from experience, these networks connect data points to recognize patterns and make predictions.
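As a rough illustration of a neural network learning connections between data points, here is a minimal sketch, assuming PyTorch is installed (the XOR-style toy data, layer sizes, and training settings are arbitrary choices for the example):

```python
import torch
import torch.nn as nn

# Toy data: four 2-D points and their XOR-style labels (illustrative only).
X = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = torch.tensor([[0.], [1.], [1.], [0.]])

# A small feed-forward network: layers of weighted connections between inputs and outputs.
model = nn.Sequential(
    nn.Linear(2, 8),   # input layer -> hidden layer
    nn.ReLU(),
    nn.Linear(8, 1),   # hidden layer -> output layer
    nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.05)

# Training adjusts the connection weights so the network maps inputs to the right outputs.
for _ in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

print(model(X).detach().round())  # should approximate the XOR labels
```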
Einstein Bots: automatically resolve top customer issues, collect qualified customer information, and seamlessly hand off customers to agents when needed.

Einstein Agent: drives agent productivity across the contact center through intelligent case routing, automatic triaging, and case field predictions.

Einstein Discovery: helps managers take action with predictive service KPIs by serving up real-time analysis of the drivers that impact those KPIs.

Einstein Vision for Field Service: automates image classification to resolve issues faster on-site. Just by taking a picture of the object, Einstein Vision can instantly identify it.

Einstein Language: brings the power of deep learning to developers. They can use pretrained models to classify text by sentiment as either positive, negative, or neutral, and to classify text by its underlying intent.

GANs (generative adversarial networks): are made up of two neural networks: a generator and a discriminator. The two networks compete with each other, which improves the quality of the generated output over time.

Transformer models, like ChatGPT (which stands for Chat Generative Pre-trained Transformer): create outputs based on sequential data (like sentences or paragraphs) rather than individual data points. This approach helps the model capture context and generate coherent text.
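As a loose illustration of a transformer model producing output from sequential input, here is a minimal sketch using the Hugging Face transformers library, assuming it is installed (the gpt2 model and prompt are arbitrary example choices, not ChatGPT itself):

```python
from transformers import pipeline

# Load a small pretrained transformer language model (weights download on first run).
generator = pipeline("text-generation", model="gpt2")

# The model continues the sequence token by token, using the earlier tokens as context.
result = generator("Generative AI helps service agents by", max_new_tokens=25, num_return_sequences=1)
print(result[0]["generated_text"])
```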
While GANs and transformers are among the most popular generative AI models, several other techniques are used as well, such as variational autoencoders (VAEs), which also rely on two neural networks to generate new data based on sample data.

GANs generator: creates an output based on some input, and then fine-tunes that output based on the discriminator's feedback.

GANs discriminator: tries to determine whether the generator's output is real or fake.
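To make the generator and discriminator roles concrete, here is a minimal GAN sketch, assuming PyTorch is installed (the 1-D "real data" distribution, network sizes, and training settings are simplifications for illustration):

```python
import torch
import torch.nn as nn

# Generator: turns random noise into a candidate sample.
generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
# Discriminator: scores a sample as real (1) or fake (0).
discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

for step in range(1000):
    # "Real" data: samples from a normal distribution centered at 4.0 (toy example).
    real = torch.randn(32, 1) + 4.0
    fake = generator(torch.randn(32, 8))

    # Train the discriminator to tell real samples from generated ones.
    d_opt.zero_grad()
    d_loss = (loss_fn(discriminator(real), torch.ones(32, 1))
              + loss_fn(discriminator(fake.detach()), torch.zeros(32, 1)))
    d_loss.backward()
    d_opt.step()

    # Train the generator to fool the discriminator (the discriminator's feedback).
    g_opt.zero_grad()
    g_loss = loss_fn(discriminator(fake), torch.ones(32, 1))
    g_loss.backward()
    g_opt.step()

print(generator(torch.randn(5, 8)).detach())  # samples should drift toward the real data
```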
What does NLP use to process and analyze text? NLP uses algorithms and methods like large language models (LLMs), statistical models, machine learning, deep learning, and more to process and analyze text.

Parsing: involves breaking down text or speech into smaller parts to classify them for NLP.

Parsing includes syntactic parsing, where elements of natural language are analyzed to identify the underlying grammatical structure, and semantic parsing, where the meaning of those elements is analyzed.
Syntactic parsing may include: segmentation, tokenization, stemming, lemmatization, part of speech tagging, and named entity recognition (NER). Each is defined below, followed by a short combined code sketch.
Segmentation: larger texts are divided into smaller, meaningful chunks. Segmentation usually occurs at the ends of sentences, at punctuation marks.

Tokenization: sentences are split into individual words, called tokens. In the English language, tokenization is a fairly straightforward task because words are typically separated by spaces.

Stemming: words are reduced to their root form, or stem. For example, breaking, breaks, or unbreakable are all reduced to break.

Lemmatization: similar to stemming, lemmatization reduces words to their root, but takes the part of speech into account to arrive at a much more accurate root word, called the lemma.

Part of speech tagging: assigns grammatical labels or tags to each word based on its part of speech, such as a noun, adjective, verb, and so on.

Named entity recognition (NER): uses algorithms to identify and classify named entities (like people, dates, places, organizations, and so on) in text to help extract key information.
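A minimal sketch of these syntactic parsing steps, assuming NLTK is installed along with its standard data packages (the sample text is made up; exact data-package names can vary slightly by NLTK version):

```python
import nltk
from nltk.tokenize import sent_tokenize, word_tokenize
from nltk.stem import PorterStemmer, WordNetLemmatizer

# One-time downloads, e.g.:
# for pkg in ["punkt", "averaged_perceptron_tagger", "wordnet", "maxent_ne_chunker", "words"]:
#     nltk.download(pkg)

text = "Maria Lopez joined Acme Corp in March. She breaks sales records every quarter."

sentences = sent_tokenize(text)                      # segmentation: split text into sentences
tokens = word_tokenize(sentences[1])                 # tokenization: split a sentence into tokens
stems = [PorterStemmer().stem(t) for t in tokens]    # stemming: "breaks" -> "break"
lemmas = [WordNetLemmatizer().lemmatize(t, pos="v")  # lemmatization: uses the part of speech
          for t in tokens]
tagged = nltk.pos_tag(word_tokenize(sentences[0]))   # part of speech tagging
entities = nltk.ne_chunk(tagged)                     # named entity recognition

print(sentences, tokens, stems, lemmas, tagged, sep="\n")
print(entities)
```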
Semantic parsing involves analyzing the grammatical format of sentences and the relationships between words and phrases to find the meaning being represented.

Several common analysis techniques are used in NLP: sentiment analysis, intent analysis, and context (discourse) analysis.
Sentiment analysis: involves determining whether a piece of text (such as a sentence, a social media post, a review, or a tweet) expresses a positive, negative, or neutral attitude.
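A minimal sentiment analysis sketch, assuming NLTK and its VADER lexicon are available (the example reviews and score thresholds are made up for illustration):

```python
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

# nltk.download("vader_lexicon")  # one-time download

analyzer = SentimentIntensityAnalyzer()
reviews = [
    "The support agent was fantastic and solved my issue fast.",
    "Terrible experience, I waited two hours for nothing.",
]
for review in reviews:
    scores = analyzer.polarity_scores(review)
    # The compound score ranges from -1 (most negative) to +1 (most positive).
    c = scores["compound"]
    label = "positive" if c > 0.05 else "negative" if c < -0.05 else "neutral"
    print(label, c, review)
```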
Intent analysis: intent helps us understand what someone wants or means based on what they say or write. It's like deciphering the purpose behind their words.
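A toy intent classification sketch using scikit-learn (the intents, training phrases, and model choice are made-up examples to show the idea, not a production approach):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny labeled dataset: each phrase is tagged with the intent behind it.
phrases = ["where is my order", "track my package", "I want a refund",
           "give me my money back", "change my shipping address", "update my delivery address"]
intents = ["track_order", "track_order", "request_refund",
           "request_refund", "update_address", "update_address"]

# Turn phrases into word-frequency features, then fit a simple classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(phrases, intents)

print(model.predict(["can you tell me where my package is"]))  # likely 'track_order'
```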
Context (discourse) analysis: natural language relies heavily on context. The interpretation of a statement might change based on the situation, the details that came before it, and the overall conversation.

Premortem: the opposite of a post-mortem; it's an opportunity to catch the "what went wrong" before it happens.

