Kelch-like protein 3

Kelch-like protein 3 is a protein in humans that is encoded by the KLHL3 gene.[5] Alternative splicing results in multiple transcript variants encoding distinct isoforms.

KLHL3

Aliases: KLHL3, PHA2D, kelch like family member 3
External IDs: OMIM: 605775; MGI: 2445185; HomoloGene: 79542; GeneCards: KLHL3

Gene location (human)[1]
Chromosome: 5; Band: 5q31.2
Start: 137,617,500 bp; End: 137,736,089 bp

Orthologs (human / mouse)
Entrez: 26249 / 100503085
Ensembl: ENSG00000146021 / ENSMUSG00000014164
UniProt: Q9UH77 / E0CZ16
RefSeq (mRNA): NM_017415, NM_001257194, NM_001257195 / NM_001195075, NM_001362415, NM_001368867, NM_001368868
RefSeq (protein): NP_001244123, NP_001244124, NP_059111 / NP_001349344, NP_001355796, NP_001355797
Location (UCSC): Chr 5: 137.62 – 137.74 Mb / Chr 13: 58 – 58.11 Mb
PubMed search[3][4]
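As a quick sanity check on the coordinates above, the size of the human KLHL3 locus can be computed from its GRCh38 start and end positions. This is a minimal sketch; treating both endpoints as inclusive (hence the +1) is an assumption about the coordinate convention.

```python
# GRCh38 coordinates of human KLHL3, taken from the infobox above
START_BP = 137_617_500
END_BP = 137_736_089

# Locus size in base pairs (assuming inclusive endpoints, hence the +1)
span_bp = END_BP - START_BP + 1
span_mb = span_bp / 1_000_000

print(f"KLHL3 spans {span_bp:,} bp (~{span_mb:.2f} Mb)")  # → ~0.12 Mb
```

The result (~0.12 Mb) is consistent with the UCSC location given above, Chr 5: 137.62 – 137.74 Mb.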

Function

This gene is ubiquitously expressed and encodes a protein with an N-terminal BTB domain, followed by a BACK domain and six kelch-like repeats at the C-terminus. The kelch-like repeats bind substrate proteins, while the BTB domain interacts with CUL3 (cullin 3) to assemble a cullin-RING E3 ubiquitin ligase (CRL) complex that ubiquitinates the bound substrates.[5]
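The division of labour described above — kelch repeats capturing the substrate, the BTB domain recruiting CUL3 — can be sketched schematically. This is a purely illustrative model with hypothetical class and method names, not a bioinformatics API; the example substrate WNK4 is a reported KLHL3 target but is not mentioned in this article.

```python
from dataclasses import dataclass, field

# Illustrative model of the KLHL3 domain architecture described above:
# an N-terminal BTB domain, a BACK domain, and six C-terminal kelch repeats.
@dataclass
class KLHL3Adaptor:
    domains: tuple = ("BTB", "BACK") + tuple(f"kelch-{i}" for i in range(1, 7))
    bound_substrates: list = field(default_factory=list)

    def bind_substrate(self, protein: str) -> None:
        # The kelch repeats recognise and hold the substrate protein.
        self.bound_substrates.append(protein)

    def recruit_cul3(self) -> str:
        # The BTB domain docks onto CUL3, nucleating a CRL complex.
        return "CRL complex: CUL3-" + type(self).__name__

def ubiquitinate(adaptor: KLHL3Adaptor) -> list:
    # Substrates held by the adaptor are tagged by the assembled
    # cullin-RING ligase, marking them for degradation.
    complex_name = adaptor.recruit_cul3()
    return [f"{s} ubiquitinated by {complex_name}" for s in adaptor.bound_substrates]

adaptor = KLHL3Adaptor()
adaptor.bind_substrate("WNK4")  # hypothetical usage; WNK4 is a reported substrate
print(ubiquitinate(adaptor))
```

The point of the sketch is only the separation of roles: substrate binding (kelch repeats) is independent of ligase recruitment (BTB domain), which is why KLHL3 acts as an interchangeable substrate adaptor for CUL3.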

Clinical significance

Mutations in this gene cause pseudohypoaldosteronism type IID (PHA2D), a rare Mendelian syndrome featuring hypertension, hyperkalaemia,[6] and metabolic acidosis.[5]

References

  1. GRCh38: Ensembl release 89: ENSG00000146021 - Ensembl, May 2017
  2. GRCm38: Ensembl release 89: ENSMUSG00000014164 - Ensembl, May 2017
  3. "Human PubMed Reference:". National Center for Biotechnology Information, U.S. National Library of Medicine.
  4. "Mouse PubMed Reference:". National Center for Biotechnology Information, U.S. National Library of Medicine.
  5. "Entrez Gene: Kelch-like 3 (Drosophila)". Retrieved 2012-04-26.
  6. "KLHL3 mutations cause familial hyperkalemic hypertension by impairing ion transport in the distal nephron". Nature Genetics. 44 (5): 609. Apr 26, 2012. doi:10.1038/ng0512-609.

This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.