Entropy is dealt with primarily in statistical mechanics and information theory. The concept originated in thermodynamics, but it was given a precise mathematical formulation in these two fields (statistical mechanics being a branch of physics with a heavily mathematical apparatus, and information theory a branch of applied mathematics).
In statistical mechanics, entropy measures the microscopic disorder or randomness of a system. It is defined in terms of the probabilities of the system's microstates: the Gibbs entropy is S = -k_B Σ p_i ln p_i, where p_i is the probability of the system occupying microstate i and k_B is Boltzmann's constant.
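As a concrete illustration, here is a minimal Python sketch of the Gibbs entropy for a discrete set of microstate probabilities; the four probability values are made up for the example.

```python
import math

# Boltzmann's constant in joules per kelvin (exact SI value)
K_B = 1.380649e-23

def gibbs_entropy(probabilities):
    """Gibbs entropy S = -k_B * sum(p_i * ln(p_i)) over microstates.

    Terms with p_i == 0 are skipped, since p ln p -> 0 as p -> 0.
    """
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)

# Hypothetical distribution over four microstates (must sum to 1)
probs = [0.4, 0.3, 0.2, 0.1]
print(gibbs_entropy(probs))  # entropy in J/K
```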
In information theory, entropy (Shannon entropy) measures the uncertainty or information content of a random variable: H(X) = -Σ p(x) log2 p(x), measured in bits. It quantifies the average number of bits needed to represent or transmit a message drawn from a given set of possibilities.
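The following short Python sketch computes Shannon entropy directly from that formula; the example distributions (a fair coin, a biased coin, and a uniform four-symbol alphabet) are chosen for illustration.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) in bits; zero-probability terms are skipped."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.469 bits
print(shannon_entropy([0.25] * 4))   # uniform over 4 symbols: 2.0 bits
```

Note that the two formulas differ only in the logarithm base and the constant k_B, which is one reason the same word "entropy" is used in both fields.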
Both statistical mechanics and information theory provide mathematical frameworks for calculating and analyzing entropy, drawing on probability theory, calculus, and linear algebra to study its behavior in physical, information, and communication systems.