Africa Yearbook

The Africa Yearbook is an annual publication devoted to politics, economy and society south of the Sahara. It is the successor to the German-language Afrika Jahrbuch published by the Institut für Afrika-Kunde in Hamburg, which issued its last yearbook in 2004 (on the year 2003).[1]

Africa Yearbook
Discipline: Africa
Language: English
Publisher: Brill Publishers (Netherlands)
ISO 4 abbreviation: Afr. Yearb.
ISSN: 1871-2525 (print); 1872-9037 (web)

Scope

The yearbook covers major domestic political developments, foreign policy, and socio-economic trends in sub-Saharan Africa, all related to developments in a single calendar year. The Africa Yearbook contains articles on all sub-Saharan states and on each of the four sub-regions (West, Central, Eastern, and Southern Africa), focusing on major cross-border developments and sub-regional organizations, as well as one article on continental developments and one on European-African relations.

While the articles are of thorough academic quality, the Yearbook is oriented mainly toward the needs of a broad range of target groups: students, politicians, diplomats, administrators, journalists, teachers, practitioners in the field of development aid, and business people.[2]

The Africa Yearbook received the 2012 Conover-Porter Award for the best Africana bibliography or reference work.[3]


References

  1. "Archived copy". Archived from the original on 2011-07-21. Retrieved 2011-05-31.
  2. "Archived copy". Archived from the original on 2010-03-10. Retrieved 2010-01-30.
  3. "List of winners of the Conover Porter Award". Archived from the original on 2019-10-07. Retrieved 2014-01-22.
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.