What is entropy in the semantic view of nature?


Viewing 3 posts - 1 through 3 (of 3 total)
  • #6564

    In thermodynamics, entropy is commonly associated with the amount of order, disorder, or chaos in a thermodynamic system. An increase in entropy means an increase in disorder, and a decrease in entropy means a decrease in disorder, or, put positively, an increase in information. Is this correct? And what does this mean from the perspective of the semantic view of nature? Is this gradual loss of order an empirical measure of irreversible time, winding the universe down to a state of chaos at the end?
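    To make the order/information link concrete: in information theory, Shannon entropy is maximal for a uniform (maximally disordered) distribution and drops as the distribution becomes more ordered, i.e. as we know more about which state the system is in. A minimal sketch, with illustrative numbers only:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 4 states: maximal disorder, maximal entropy.
uniform = [0.25, 0.25, 0.25, 0.25]
# Sharply peaked distribution: more order, so lower entropy
# (equivalently, more information about the actual state).
peaked = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # 2.0 bits
print(shannon_entropy(peaked))   # much less than 2 bits
```

    Whether this information-theoretic entropy is the same thing as thermodynamic entropy is itself a debated question, but the "less entropy = more information" slogan at least has this precise reading.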

    #6587

    To understand entropy, we have to visualize two systems, one hot and the other cold. Energy flows from the hot object to the cold object until they attain equilibrium; after that, there is no net flow in either direction. Similarly, if many systems come into contact with one another, initially there are thermal gradients along which energy flows, but over time every object reaches the same temperature and the flow stops. So energy basically flows from hot to cold, in one direction, and this one-directional flow of energy is supposed to constitute the thermodynamic arrow of time.
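    This hot-to-cold flow, and the entropy increase that accompanies it, can be sketched numerically. Assuming two bodies with equal heat capacities and moving heat in small discrete steps (all values here are illustrative), each transfer raises the cold body's entropy by more than it lowers the hot body's, so the total always grows:

```python
# Toy two-body heat exchange with equal heat capacities.
# Per step, entropy change is dQ/T_cold - dQ/T_hot > 0 while T_hot > T_cold.
T_hot, T_cold = 400.0, 200.0   # kelvin (illustrative values)
C = 1.0                        # heat capacity of each body (J/K)
dQ = 0.1                       # heat moved per step (J)

total_dS = 0.0
while T_hot - T_cold > 1e-6:
    total_dS += dQ / T_cold - dQ / T_hot  # cold gains more entropy than hot loses
    T_hot -= dQ / C
    T_cold += dQ / C

print(round(T_hot, 3), round(T_cold, 3))  # both bodies end near 300 K
print(total_dS > 0)                       # net entropy increased
```

    The simulation stops when the gradient is gone, which is exactly the classical picture: no gradient, no flow, no further change.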

    In the semantic view, time is cyclic instead of linear. Energy still flows in one direction, but nature is structured so that the energy lost by a system is eventually gained back, because it travels in a closed loop. In the classical picture, energy spreads out in all directions until no gradient is left, which is called the “heat death” of the universe. In a semantic system, this energy doesn’t escape to infinity; rather, it goes round and round in a cycle.

    The reason for this cyclic behavior is that space is closed and has a boundary, so the energy is reflected back from the boundary rather than escaping to infinity.

    Yet another reason is that the universe is hierarchical, so we can never attain true uniformity. When a hot object is put in contact with a cold object, we think that after some time the objects have equal temperatures and no heat is being exchanged. But a semantic way to think about it is that energy keeps flowing cyclically, back and forth between the two systems, so they only appear to be in equilibrium while a thermal current is still running. According to classical physics, no heat is flowing; in the semantic approach, a cyclic current is flowing.
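    For comparison, there are ordinary physical systems where energy really does flow back and forth rather than one way: two identical oscillators joined by a weak spring pass their energy to each other cyclically (the familiar "beats" phenomenon). A rough numerical sketch, with made-up parameters, just to illustrate back-and-forth energy exchange:

```python
# Two identical oscillators (spring constant k) coupled by a weak spring (kc).
# Oscillator 1 starts with all the energy; it flows to oscillator 2 and back.
k, kc, m, dt = 1.0, 0.05, 1.0, 0.001
x1, v1 = 1.0, 0.0   # oscillator 1 displaced
x2, v2 = 0.0, 0.0   # oscillator 2 at rest

def energy(x, v):
    """Energy held by one oscillator (ignoring the weak coupling spring)."""
    return 0.5 * m * v * v + 0.5 * k * x * x

e2_max = 0.0
for _ in range(200_000):  # long enough to cover a full beat period
    a1 = (-k * x1 - kc * (x1 - x2)) / m
    a2 = (-k * x2 - kc * (x2 - x1)) / m
    v1 += a1 * dt; v2 += a2 * dt   # semi-implicit Euler integration
    x1 += v1 * dt; x2 += v2 * dt
    e2_max = max(e2_max, energy(x2, v2))

print(e2_max)  # oscillator 2 eventually holds nearly all the energy (~0.5)
```

    Whether the universe as a whole behaves like such a closed, cyclic system is of course the speculative claim here; the sketch only shows that cyclic energy exchange is physically unremarkable in bounded, coupled systems.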

    If you look at a very small segment of a cycle, it appears almost linear, and under this impression of linear time, we conclude that entropy is always increasing. But if we look at a larger segment of the cycle, we can see that entropy both increases and decreases. As long as we keep thinking of time as linear, our theories will only tell us about entropy increase. Only when we think in terms of closed space and cycles of change can we think about both entropy increase and decrease.
