Eberhard Blum

Eberhard Blum (April 28, 1919 – July 9, 2003), born in Kiel, was the fourth president of the German Federal Intelligence Service (BND). He served in the Wehrmacht on the Eastern Front during World War II, finishing the war with the rank of Rittmeister (cavalry captain). After the war he completed his university studies in law and political science[1] and in 1947 joined the Gehlen Organization, the precursor of the BND. He became personal assistant to Reinhard Gehlen under the codename HARTWIG.[2]

Eberhard Blum
Born: 28 April 1919, Kiel, Germany
Died: 9 July 2003 (aged 84), Stuttgart, Germany
Battles/wars: World War II

From 1961 to 1964 he headed the BND's personnel subdivision. From 1964 to 1968 he served as the BND's resident (station chief) in London.

After his period in London he returned to Pullach as head of Department IV, Administration (Abteilung IV Verwaltung). He held that position until 1970, when major disputes with the then BND president Gerhard Wessel arose and he was transferred as resident to Washington, where he remained until 1982.

He took over as head of the Bundesnachrichtendienst in 1982 and remained in office until 1985.

Blum died on July 9, 2003 in Stuttgart.

Footnotes

  1. "Ehemalige Präsidenten des Bundesnachrichtendienstes". Retrieved 13 December 2012.
  2. Adams, Jefferson (2009). Historical Dictionary of German Intelligence. Scarecrow Press.
Government offices
Preceded by: Klaus Kinkel
President of the Federal Intelligence Service, 1982–1985
Succeeded by: Heribert Hellenbroich