Klara Kedem

Klara Kedem is an Israeli computer scientist, a professor of computer science at Ben-Gurion University in Beer-Sheva, Israel[1] and an adjunct faculty member in computer science at Cornell University in Ithaca, New York.[2]

Kedem received her Ph.D. in 1989 from Tel Aviv University, under the supervision of Micha Sharir.[3] Her most widely cited research publications are in computational geometry, and concern problems of shape comparison,[ACH] motion planning,[KLP] and Voronoi diagrams.[HKS] She has also collaborated with philosophers and linguists on a project to decipher handwritten medieval Hebrew writings that had been overwritten in Arabic.[4]

Selected publications

ACH. Arkin, Esther M.; Chew, L. Paul; Huttenlocher, Daniel P.; Kedem, Klara; Mitchell, Joseph S. B. (1991), "An Efficiently Computable Metric for Comparing Polygonal Shapes" (PDF), IEEE Trans. Pattern Anal. Mach. Intell., 13 (3): 209–216, doi:10.1109/34.75509, Zbl 0800.68949
HKS. Huttenlocher, Daniel P.; Kedem, Klara; Sharir, Micha (1993), "The upper envelope of Voronoĭ surfaces and its applications", Discrete & Computational Geometry, 9 (3): 267–291, doi:10.1007/BF02189323, MR 1204784, Zbl 0770.68111, EuDML 131248
KLP. Kedem, Klara; Livné, Ron; Pach, János; Sharir, Micha (1986), "On the union of Jordan regions and collision-free translational motion amidst polygonal obstacles", Discrete & Computational Geometry, 1 (1): 59–71, doi:10.1007/BF02187683, MR 0824108, Zbl 0594.52004, EuDML 130981

References

  1. Faculty listing, Computer Science, Ben-Gurion University, retrieved 2012-09-30.
  2. Faculty listing, Cornell University, retrieved 2012-09-30.
  3. Klara Kedem at the Mathematics Genealogy Project
  4. Fisher, Hannah (August 12, 2009), "Algorithms help unravel the secrets of ancient documents. B-G University project discovers Hebrew prayers under Arabic lettering", Jerusalem Post, archived from the original on October 11, 2014.
