Google Cloud Dataflow
Google Cloud Dataflow is a fully managed service for executing Apache Beam pipelines within the Google Cloud Platform ecosystem.
History
Google Cloud Dataflow was announced in June 2014[1] and released to the general public as an open beta in April 2015.[2] In January 2016, Google donated the underlying SDK, the implementation of a local runner, and a set of IOs (data connectors) for accessing Google Cloud Platform data services to the Apache Software Foundation.[3] The donated code formed the original basis for the Apache Beam project.
References
- "Sneak peek: Google Cloud Dataflow, a Cloud-native data processing service". Google Cloud Platform Blog. Retrieved 2018-09-08.
- "Google Opens Cloud Dataflow To All Developers, Launches European Zone For BigQuery". TechCrunch. Retrieved 2018-09-08.
- "Google wants to donate its Dataflow technology to Apache". VentureBeat. Retrieved 2019-02-21.
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.