It's not a question, but rather thinking out loud.
I've got to the point where reading an average contemporary fiction book in French gives me on average 500 new words/expressions. I would love to keep expanding my knowledge of everyday words and idiomatic expressions, but the problem is that I rarely enjoy contemporary novels and, frankly speaking, I don't have that much time for reading.
Alain Robbe-Grillet once wrote a novel, Djinn, specifically for learners of French (check it on wiki; it is structured in such a way that it reminds me of the famous "Nature Method" of Lingua latina per se illustrata). There is also a group of experimental French writers, Oulipo, famous for their formal experiments (Georges Perec, for instance, wrote a novel, La Disparition, which doesn't use the letter e, the most frequent vowel in French).
All learners' materials are centered around high-frequency words. Novels, on the other hand, become 'ineffective' tools of vocabulary expansion: you read 80k words and only get 500 new ones.
So this leads me to my utopian vision of a novel for language learners. What I have in mind is some kind of Oulipo experiment for learning purposes: a novel (a long sequence of stories, or sketches?) which would methodically go through a frequency dictionary and try to connect its entries into a semblance of a coherent narrative. (Suppose every 15-minute chapter had to introduce 50 new words and mention each of them at least 5-7 times in different contexts.)
Imagine a book which doesn't have 7-8k unique words, as an average novel does, but at least 20k. That would mean that by relistening to a single audiobook, you would review the core vocabulary which, as studies show, has to be way bigger than 5k.
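The chapter constraint above is easy to state as a checker. A minimal sketch (the tokenizer is a naive stand-in for real lemmatization, and the toy chapter text is invented for illustration):

```python
from collections import Counter
import re

def check_chapter(text, new_words, min_mentions=5):
    """Return the newly introduced words that fall short of the
    required number of mentions in this chapter."""
    # Naive tokenizer: lowercase, keep letter runs (incl. French accents).
    counts = Counter(re.findall(r"[a-zàâçéèêëîïôûùüÿœ'-]+", text.lower()))
    return {w: counts[w] for w in new_words if counts[w] < min_mentions}

# Toy chapter: 'chat' appears five times, 'fenêtre' only twice.
chapter = ("Le chat regarde la fenêtre. Le chat dort. Un chat noir, "
           "un chat gris, encore un chat. La fenêtre est ouverte.")
print(check_chapter(chapter, ["chat", "fenêtre"]))  # {'fenêtre': 2}
```

A real version would lemmatize first, so that "chat" and "chats" count as one word.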
I'm aware that this is just a utopian vision but, damn, what a beautiful utopia it could be!
Oulipo approach to language learning
- einzelne
- Blue Belt
- Posts: 804
- Joined: Sat Mar 17, 2018 11:33 pm
- Languages: Russian (N), English (Working knowledge), French (Reading), German (Reading), Italian (Reading on Kindle)
- x 2884
- Keys
- Yellow Belt
- Posts: 92
- Joined: Sat Oct 24, 2015 1:54 am
- Location: Toronto
- Languages: Dutch (N), English (C2), German (C1), French (B2), Swedish (B2), Spanish (B2), Italian (B2), Russian (B2), Hungarian (B1), Polish (B1), Urdu (A2); reading literature and listening to audiobooks in Danish, Dutch, English, French, German, Hungarian, Indonesian, Italian, Polish, Portuguese, Russian, Swedish and Spanish. Studying Urdu, Polish atm.
- x 264
- Contact:
Re: Oulipo approach to language learning
That sounds very cool.
If you don't want to artificially write a whole new novel based on that idea, you could take an existing book, dumbify the first chapters by taking low-frequency words out, so the book can be started by beginners, and then add, say, 15k low-frequency words in the rest of the book, so you can learn a total of 20k words.
If you need to encounter those 15k words in context 20 times to remember them, you'd need to pad the book with 300,000 words, maybe using different conjugations and forms of the same root, so the result is less terrible to read and the words less terribly hard to insert. You'd also need quite the tome to begin with to insert so many words.
Inserting 300,000 words into an existing book would take prohibitively long, I guess. Maybe encountering a word ten times is good enough, or maybe just re-reading the book would do. Or write something from scratch after all. I'd definitely buy it.
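The padding arithmetic above checks out; a trivial sanity check (the exposure counts are the assumptions stated in the post):

```python
new_words = 15_000    # low-frequency words to insert into the book
encounters = 20       # exposures assumed necessary to retain a word
print(new_words * encounters)  # 300000 extra words of padding
print(new_words * 10)          # 150000 if ten encounters suffice
```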
It's like something I thought of ages ago and recently saw actually exists: a book where words are gradually replaced with the target language. You start in English and end up in the foreign language, with more and more words replaced as you go. I think I saw an application of that with Alice in Wonderland, or it might have been another classic work.
I like this idea better, though, as it solves the problem of slow language acquisition through reading, as you say.
5 x
- zenmonkey
- Black Belt - 2nd Dan
- Posts: 2528
- Joined: Sun Jul 26, 2015 7:21 pm
- Location: California, Germany and France
- Languages: Spanish, English, French trilingual - German (B2/C1) on/off study: Persian, Hebrew, Tibetan, Setswana.
Some knowledge of Italian, Portuguese, Ladino, Yiddish ...
Want to tackle Tzotzil, Nahuatl - Language Log: viewtopic.php?f=15&t=859
- x 7032
- Contact:
Re: Oulipo approach to language learning
I like this.
It is also one of the reasons that I moved to music as I felt that vocabulary could be pretty rich in some genres.
This is why I like Aesop Rock to push my daughters.
For French, you might look into things like Fauve. (They come to mind because I was just listening to this.)
But back to books: there are a lot of shorter paperbacks from relatively varied authors that should deliver on the vocabulary diversity, from the Série Noire, to Apollinaire, to Camus...
4 x
I am a leaf on the wind, watch how I soar
- rdearman
- Site Admin
- Posts: 7259
- Joined: Thu May 14, 2015 4:18 pm
- Location: United Kingdom
- Languages: English (N)
- Language Log: viewtopic.php?f=15&t=1836
- x 23298
- Contact:
Re: Oulipo approach to language learning
I thought I knew that intro. I have it on a playlist somewhere.
2 x
Read 150 books in 2024
My YouTube Channel
The Autodidactic Podcast
My Author's Newsletter
I post on this forum with mobile devices, so excuse short msgs and typos.
- Iversen
- Black Belt - 4th Dan
- Posts: 4783
- Joined: Sun Jul 19, 2015 7:36 pm
- Location: Denmark
- Languages: Monolingual travels in Danish, English, German, Dutch, Swedish, French, Portuguese, Spanish, Catalan, Italian, Romanian and (part time) Esperanto
Ahem, not yet: Norwegian, Afrikaans, Platt, Scots, Russian, Serbian, Bulgarian, Albanian, Greek, Latin, Irish, Indonesian and a few more... - Language Log: viewtopic.php?f=15&t=1027
- x 15030
Re: Oulipo approach to language learning
If you wanted to design a book so that you really could learn many new words from it, then the best strategy would probably be to use informative contexts, at least for those words that might be new to most accomplished readers. I suppose it could be done in an unobtrusive way, but probably not if the target segment were beginners. Or add footnotes; but, for instance, I own a pedagogical Russian history book with accents and word explanations, and the explanations rarely deal with the words I found difficult when I read it; more often than not they explained those that I could guess without help.
Concerning density of new words: I just stumbled over a homepage with some statistics for Anglophone novels. The graphics for 'unique words' (or rather: wordforms, including proper names) are slightly misleading, because they suggest that "A Tale of Two Cities" by Dickens is the winner and "The Lion (etc.)" by Lewis is the pitiful loser; but when you take the total number of words per book into account, the lowest distance between unique words is found in "The Adventures of Tom Sawyer (...)" by Mark Twain (every ninth word is new) and the highest in "Sense and Sensibility" by Jane Austen (every sixteen-and-a-halfth). Twain's is also the shortest, while Austen's is the longest but one (only Dickens is longer), and one and the same author is liable to repeat his/her favorite words more often in a very long book. However, those statistics don't tell where the words in question would rank on a frequency list. And the counts must be for wordforms, not words; otherwise the numbers would be a good deal lower.
Apart from that, the best way to find many unique words with explanations would be to read a dictionary (just for the fun of it).
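Figures like "every ninth word is new" can be reproduced for any plain-text book in a few lines. A naive sketch that, as noted above, counts wordforms rather than words (the sample sentence is invented):

```python
import re

def wordform_density(text):
    """Total words divided by unique wordforms: the 'distance between
    unique words' discussed above. Inflected forms count separately,
    so the unique count is higher than a lemma count would be."""
    tokens = re.findall(r"[\w'-]+", text.lower())
    return len(tokens), len(set(tokens)), len(tokens) / len(set(tokens))

total, unique, ratio = wordform_density(
    "the cat sat on the mat and the dog sat too")
print(total, unique, round(ratio, 2))  # 11 tokens, 8 wordforms
```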
Keys wrote:... where words in one book are replaced with the target language. You start in English and end up in the foreign language, with more and more words replaced as you go. I think I saw an application of that with Alice in Wonderland, or it might have been another classic work. I like this idea better, though, as it solves the problem of slow language acquisition through reading, as you say.
That might also work (when used in conjunction with more traditional sources).
3 x
- språker
- Yellow Belt
- Posts: 88
- Joined: Tue Oct 05, 2021 11:46 am
- Location: Vilnius, Lithuania
- Languages: Swedish (N), English (C1), German (B2), French (A2), Lithuanian (B1 -- studying)
- x 345
Re: Oulipo approach to language learning
This is an interesting idea! I am replying from a slightly dry language-learning perspective, though.
einzelne wrote:All learners' materials are centered around high-frequency words. Novels, on the other hand, become 'ineffective' tools of vocabulary expansion: you read 80k words and only get 500 new ones.
Isn't that about what "extensive reading" is aiming for? Even if knowing a lot of words is a cornerstone of knowing a language, just seeing them combined in different sentences is also important. It does not bother me that I know most words in texts in my native language, or even in English most of the time; I am still learning in both, even if I am not studying.
einzelne wrote:Imagine a book which doesn't have 7-8k unique words, as an average novel does, but at least 20k. That would mean that by relistening to a single audiobook, you would review the core vocabulary which, as studies show, has to be way bigger than 5k.
I suppose that it is almost always possible to find a more advanced text that is challenging, even in a language you know quite well. If not, I would just be happy with my reading level and start writing instead. If running alone doesn't improve your running, doing yoga or weightlifting might.
4 x
- einzelne
Re: Oulipo approach to language learning
zenmonkey wrote:This is why I like Aesop Rock to push my daughters.
Oh, Aesop Rock, my favorite one, the Herman Melville of hip-hop!
zenmonkey wrote:But back to books: there are a lot of shorter paperbacks from relatively varied authors that should deliver on the vocabulary diversity, from the Série Noire, to Apollinaire, to Camus...
It's the same here; statistical distribution works here as well: shorter books -> less new vocabulary. In fact, if I just want to read for pleasure, I take a random Modiano book (they are all around 30k words).
0 x
- einzelne
Re: Oulipo approach to language learning
språker wrote:I suppose that it is almost always possible to find a more advanced text that is challenging, even in a language you know quite well.
Yes, it's easy. Take some classical or modernist novel and, boom, you have it! But these words won't be as useful in everyday life as, say, words from some polar (a French crime novel). I'm interested in expanding (and reviewing) precisely that kind of everyday words and expressions. That's how I came up with my utopian idea of an Oulipo novel.
1 x
- einzelne
Re: Oulipo approach to language learning
Iversen wrote:However, those statistics don't tell where the words in question would rank on a frequency list. And the counts must be for wordforms, not words; otherwise the numbers would be a good deal lower.
Indeed. And apart from the number of unique words, there are other factors you have to take into account: the density of these words, etc. (Perseus has nice statistics for some Classical texts: https://www.perseus.tufts.edu/hopper/help/vocab#size).
Yes, dictionaries can be an answer, but there's a problem: we don't have good frequency dictionaries for the 20k range and up. Second, maybe it's personal, but for me it's always been easier to remember words in some kind of context (that's why short Assimil dialogues work so well for me). Random words, even random sentences, simply don't cut the mustard.
2 x
- zenmonkey
Re: Oulipo approach to language learning
einzelne wrote:It's the same here; statistical distribution works here as well: shorter books -> less new vocabulary. In fact, if I just want to read for pleasure, I take a random Modiano book (they are all around 30k words).
Not really: shorter books by different authors with different themes will bring you different vocabulary sets (obviously with strong overlaps).
If you really want to maximize the vocabulary variance, run your e-books through a frequency analysis (after you lemmatize them, with spaCy for French) and choose the books that show the higher variance, or automate sentence/paragraph mining from a list of keywords.
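That ranking step can be sketched in a few lines. The tokenizer below is a naive stand-in for a real lemmatizer (for French, spaCy's fr_core_news_* models would handle lemmatization as suggested above), and the sample "books" are toy strings:

```python
import re
from collections import Counter

def lexical_profile(text):
    """Wordform frequency profile; a real pipeline would lemmatize first."""
    return Counter(re.findall(r"[\w'-]+", text.lower()))

def rank_by_variety(books):
    """Rank {title: text} by unique-token ratio, highest first,
    as a rough proxy for vocabulary variance."""
    def ratio(text):
        counts = lexical_profile(text)
        total = sum(counts.values())
        return len(counts) / total if total else 0.0
    return sorted(books, key=lambda title: ratio(books[title]), reverse=True)

books = {
    "repetitive": "le chat dort le chat dort le chat dort",
    "varied": "un renard agile saute par-dessus le chien paresseux",
}
print(rank_by_variety(books))  # ['varied', 'repetitive']
```

For whole e-books, the unique-token ratio should be compared at equal sample lengths, since longer texts repeat words more.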
There is also a server version of Learning with Texts floating around somewhere here.
Edit: wow, saw some of the drama with LWT but happy to see it is back up and available.
3 x
I am a leaf on the wind, watch how I soar