
Survey on Randomly Generating English Sentences

EasyChair Preprint 7655

5 pages
Date: March 29, 2022

Abstract

There is no end to a language: infinitely many sentences can be created by combining words. Our program uses a Markov chain to accomplish this task. A Markov chain is a model describing a sequence of events in which the probability of each step depends only on the preceding event. In our model, we build a Markov chain that chooses the next word to place based on the word the model is currently on, so that arbitrarily many sentences can be generated by combining words.

Keyphrases: Bayesian modelling, English, Markov chain, probability
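
The approach described in the abstract can be illustrated with a minimal sketch of a first-order Markov chain text generator. This is an illustrative sketch, not the authors' implementation: the corpus, function names, and parameters below are hypothetical assumptions.

import random
from collections import defaultdict

# Build a first-order Markov chain: map each word to the list of words
# observed to follow it in the training text. Sampling from that list
# reproduces the observed next-word frequencies.
def build_chain(corpus):
    chain = defaultdict(list)
    words = corpus.split()
    for current_word, next_word in zip(words, words[1:]):
        chain[current_word].append(next_word)
    return chain

# Generate a sentence by repeatedly choosing the next word based only on
# the current word, as the abstract describes.
def generate_sentence(chain, start_word, max_words=15):
    word = start_word
    sentence = [word]
    for _ in range(max_words - 1):
        followers = chain.get(word)
        if not followers:  # dead end: this word was never followed by another
            break
        word = random.choice(followers)
        sentence.append(word)
    return " ".join(sentence)

# Hypothetical usage with a toy corpus:
corpus = "the cat sat on the mat and the dog sat on the rug"
chain = build_chain(corpus)
print(generate_sentence(chain, "the"))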

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:7655,
  author    = {Arunav Chandra and Aashay Bongulwar and Aayush Jadhav and Rishikesh Ahire and Amogh Dumbre and Sumaan Ali and Anveshika Kamble and Rohit Arole and Bijin Jiby and Sukhpreet Bhatti},
  title     = {Survey on Randomly Generating English Sentences},
  howpublished = {EasyChair Preprint 7655},
  year      = {EasyChair, 2022}}