Triplesplit Series, Vol. 1

Triplesplit Series, Vol. 1 was one of Sunset Alliance's first releases, and remains one of its best-selling records. It features Andherson, Before Braille, and Fivespeed, each of whom would make an appearance on the popular Emo Diaries series on Deep Elm Records.[1][2][3] This record came at an interesting time for each of the bands: Fivespeed had just opened for Jimmy Eat World at the release of Bleed American,[4] Before Braille was preparing to enter the studio to record its full-length album,[4] and Andherson had a "homecoming" of sorts in the Phoenix metro area with this release, after having relocated to Berkeley, California.[4]

Triplesplit Series, Vol. 1
Split EP by Fivespeed, Before Braille, and Andherson
Released: December 1, 2000
Recorded: Flying Blanket Recording, et al.
Genre: Progressive rock, indie rock, emo
Length: 36:21
Label: Sunset Alliance (SA 005)

Originally, as its name implies, this was intended to be the first in a series of three-way split EPs.[4] In fact, plans were laid for a "Volume 2," which was to be released with Fueled by Ramen Records and carry the bands Before Braille, Seven Storey, and the Go Reflex (Bob Hoag's post-Pollen band).[4] However, Before Braille soon thereafter signed a contract with Aezra Records that prohibited the band from contributing to any compilations,[5] and Before Braille's lead singer, David Jensen, is also the owner of Sunset Alliance; these factors may have led future editions of the project to be shelved.

Track listing

No.  Title                  Artist          Length
1.   "Blood Over Wine"      Fivespeed       3:23
2.   "Kid"                  Fivespeed       2:36
3.   "Fallen Through"       Fivespeed       2:26
4.   "Select Start"         Before Braille  3:19
5.   "Red Tape"             Before Braille  3:27
6.   "Low End of Luxury"    Before Braille  4:37
7.   "Perennial"            Andherson       6:17
8.   "Small Concession"     Andherson       3:52
9.   "41st"                 Andherson       6:24

References
