The Shannon entropy measures the expected information value of messages. As with thermodynamic entropy, the Shannon entropy is only defined within a system that specifies at the outset which collections of possible messages, analogous to microstates, are to be treated as indistinguishable macrostates. This fundamental insight is applied here for the first time to amino acid alphabets, which group the twenty common amino acids into families based on chemical and physical similarities. To evaluate these schemas objectively, a novel quantitative method is introduced based on the inherent redundancy in the canonical genetic code. Each alphabet is taken as a separate system that partitions the 64 possible RNA codons, the microstates, into families...
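To make the construction concrete, the sketch below treats an amino acid alphabet as a partition of the 64 codons of the standard genetic code and computes the Shannon entropy of that partition. It assumes each family's probability is proportional to the number of codons mapping into it, that stop codons form their own family, and that codons are weighted uniformly; the six-family grouping shown is purely illustrative, not the paper's specific alphabet or weighting.

```python
from collections import Counter
from itertools import product
from math import log2

# Standard genetic code, codons enumerated with bases in the order U, C, A, G;
# '*' marks the three stop codons.
BASES = "UCAG"
AA_BY_CODON = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {
    "".join(codon): aa
    for codon, aa in zip(product(BASES, repeat=3), AA_BY_CODON)
}

def alphabet_entropy(families, include_stops=True):
    """Shannon entropy (bits) of the codon partition induced by an alphabet.

    `families` maps a family label to the amino acids it contains. Each
    family's probability is taken to be proportional to the number of codons
    that translate to one of its members -- an assumption; other choices
    (e.g. weighting by observed codon usage) are possible.
    """
    aa_to_family = {aa: fam for fam, members in families.items() for aa in members}
    counts = Counter()
    for codon, aa in CODON_TABLE.items():
        if aa == "*":
            if include_stops:
                counts["stop"] += 1  # stop codons kept as their own family
            continue
        counts[aa_to_family[aa]] += 1
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Illustrative (hypothetical) six-family alphabet grouped by rough chemistry.
example_alphabet = {
    "hydrophobic": "AVLIMC",
    "aromatic":    "FWY",
    "polar":       "STNQ",
    "positive":    "KRH",
    "negative":    "DE",
    "special":     "GP",
}

# Upper bound for comparison: every amino acid is its own family.
full_alphabet = {aa: aa for aa in set(AA_BY_CODON) - {"*"}}

print(f"six-family alphabet:    {alphabet_entropy(example_alphabet):.3f} bits")
print(f"twenty-letter alphabet: {alphabet_entropy(full_alphabet):.3f} bits")
```

Coarser alphabets merge more codons into each macrostate and therefore yield lower entropies, which is what allows different grouping schemas to be compared on a common scale.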
The genetic code maps the sixty-four nucleotide triplets (codons) to twenty amino acids. While the b...
For HIV-1 Env amino acids 521–606, Shannon entropies of individual codons (red bars) and average ...
A new approach to estimate the Shannon entropy of a long-range correlated sequence is proposed. The ...
The canonical genetic code is the nearly universal language for translating the information stored i...
A comprehensive data base is analyzed to determine the Shannon information content of a protein sequ...
This paper studies the information content of the chromosomes of twenty-three species. Several stat...
We have developed an approach based on information theory to compute the structural information cont...
For HIV-1 Gag amino acids 148–214, Shannon entropies of individual codons (red bars) and average ...
For HIV-1 Gag amino acids 250–335, Shannon entropies of individual codons (red bars) and average ...
We envision the molecular evolution process as an information transfer process and provide a quantit...
This paper studies the chromosome information of twenty-five species, namely mammals, fishes, bird...
Information content of a polymeric macromolecule can be calculated in bits, by multiplying the numbe...
The transmission of genomic information from coding sequence to protein structure during protein syn...
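Several of the related items above concern per-position Shannon entropies of sequence data (for example, the HIV-1 Env and Gag figure captions, which plot Shannon entropies of individual codons). A minimal sketch of that standard per-column calculation, in bits, using a made-up toy alignment; gap and ambiguity handling are additional choices not shown here.

```python
from collections import Counter
from math import log2

def positional_entropy(aligned_seqs):
    """Shannon entropy (bits) at each column of an alignment.

    Probabilities are the observed symbol frequencies at that column.
    """
    length = len(aligned_seqs[0])
    entropies = []
    for i in range(length):
        counts = Counter(seq[i] for seq in aligned_seqs)
        total = sum(counts.values())
        entropies.append(
            -sum((n / total) * log2(n / total) for n in counts.values())
        )
    return entropies

# Toy alignment (hypothetical): one conserved column, three variable ones.
seqs = ["MKVL", "MKIL", "MRVL", "MKVF"]
print([round(h, 3) for h in positional_entropy(seqs)])
# conserved first column -> 0.0 bits; variable columns -> > 0 bits
```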