Elizaveta Litvinova

Elizaveta Fedorovna Litvinova (1845–1919?) was a Russian mathematician and pedagogue. She wrote more than 70 articles on mathematics education.

Early life and education

Born in 1845 in czarist Russia as Elizaveta Fedorovna Ivanshkina, she completed her early education at a women's high school in Saint Petersburg. In 1866 she married Viktor Litvinov, who, unlike Vladimir Kovalevskii (Sofia Kovalevskaya's husband), would not allow her to travel to Europe to study at the universities there. Instead, Litvinova began studying privately with Strannoliubskii, who had also tutored Kovalevskaya.

In 1872, after her husband died, Litvinova went to Zürich and enrolled at the polytechnic institute there. In 1873 the Russian czar decreed that all Russian women studying in Zürich must return to Russia or be barred from licensed professions on their return. Litvinova was one of the few who ignored the decree; she remained to continue her studies, earning her baccalaureate in Zürich in 1876 and her doctorate from the University of Bern in 1878.

Career and later life

When Litvinova returned to Russia, she was denied university appointments because she had defied the 1873 recall. She taught at a women's high school instead and supplemented her meager income by writing biographies of famous scholars, including the mathematician Kovalevskaya and the philosopher Aristotle. Litvinova is believed to have died in 1919, after her retirement, amid the upheaval that followed the Russian Revolution.

This article incorporates material from Elizaveta Litvinova on PlanetMath, which is licensed under the Creative Commons Attribution/Share-Alike License.
