Unsane (album)

Unsane is Unsane's debut album, released in 1991 through Matador Records. It is the only studio album by the group to feature founding member Charlie Ondras, who died of a heroin overdose at the 1992 New Music Seminar in New York while the band was touring in support of the album.[2] The album's cover art, depicting a decapitated corpse on subway tracks, was given to the band by a friend who had worked on the investigation of the case.[3]

Unsane
Studio album by Unsane
Released: November 26, 1991
Recorded: January 16, 1991
Studio: Fun City (New York City, New York)
Genre: Noise rock,[1] post-hardcore
Length: 36:52
Label: Matador
Producer: Wharton Tiers
Unsane chronology: Unsane (1991) · Singles 89–92 (1992)

Death metal band Entombed covered "Vandal-X" on their self-titled compilation album in 1997.

Reception

Professional ratings
Review scores
Source: Allmusic[1]
Patrick Kennedy from Allmusic called it a brilliant and daring debut that "assaults the senses like the Swans or Foetus before them, but tempers that art-scum priggishness with clear roots in punk and classic rock."[1]

Track listing

All tracks are written by Unsane.

No. | Title | Length
1. | "Organ Donor" | 2:10
2. | "Bath" | 2:54
3. | "Maggot" | 3:17
4. | "Cracked Up" | 2:57
5. | "Slag" | 2:43
6. | "Exterminator" | 5:55
7. | "Vandal-X" | 2:04
8. | "HLL." | 2:31
9. | "AZA-2000" | 2:33
10. | "Cut" | 2:48
11. | "Action Man" | 2:28
12. | "White Hand" | 4:26
Total length: 36:52

References

  1. Kennedy, Patrick. "Allmusic ((( Unsane > Review )))". Allmusic. Retrieved January 17, 2011.
  2. Jones, Brad. "Unsane in the Brain". Unsane Biography, October 1994. Retrieved March 31, 2011.
  3. Jagernauth, Kevin (2015-10-29). "Exclusive: Have A Religious Experience With Unsane In Clip From Amphetamine Reptile Doc 'The Color Of Noise'". indiewire.com. Indie Wire. Retrieved 2018-02-04.