Yoke's Fresh Market

Yoke's Fresh Market is an employee-owned, Spokane, Washington-based chain of grocery stores founded in 1946 by Marshall and Harriet Yoke. Their son Chuck expanded the business into a chain in the 1960s, and it now encompasses 17 stores in Washington and Idaho, primarily in the Spokane area. In 1990, Chuck sold the chain to the employees. John Bole currently directs company operations.[1]

Yoke's Fresh Markets
Type: Grocery, employee-owned
Industry: Retail
Founded: 1946
Headquarters: Spokane, Washington
Number of locations: 17
Products: Bakery, dairy, deli, frozen foods, general grocery, meat, produce, seafood, snacks, liquor
Website: http://www.yokesfreshmarkets.com

In recent years, the chain has expanded into the southeastern portion of Washington, with stores in Pasco, Kennewick, West Richland, and Richland.
The Liberty Lake Yoke's opened its doors on March 2, 2016, in the former Safeway/Haggen location.[2]
In September 2016, Yoke's Fresh Market expanded its store total to 17 by acquiring Trading Company Stores locations in Cheney, Latah Creek (South Spokane), Spokane Valley, and Post Falls.[3]

Store Locations

Spokane, Washington locations

  • Airway Heights - 12825 W. Sunset Hwy.
  • Argonne - 9329 E. Montgomery Ave.
  • Cheney - 4 Cheney-Spokane Rd
  • Deer Park - 810 S. Main St.
  • Indian Trail - 3321 W. Indian Trail Rd.
  • Latah Creek - 4235 S. Cheney-Spokane Rd (west of US-195)
  • Liberty Lake - 1233 N. Liberty Lake Rd.
  • Mead - 14202 N. Market St.
  • North Foothills - 210 E. North Foothills Dr.
  • Sprague & McDonald - 13014 E. Sprague Ave
  • Sprague - 15111 E. Sprague Ave. Closed in late 2010.[4]
  • Wellesley - 4507 W. Wellesley Ave. Closed in September 2007.[5]

Washington: other locations

Idaho locations


References

This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.