Thursday, May 24, 2012

Matlab's userpath and how to use it

I've been having some trouble lately on my computer dealing with MATLAB's search path (specifically in MATLAB R2011a). I try to change the user path from the GUI via File -> Set Path..., which is equivalent to entering 'pathtool' in the Command Window, but my settings are never saved when I restart MATLAB. I suspect that the problem involves file permissions managed by Windows 7, which typically prevents me from modifying files on my hard drive from within MATLAB.

I managed to change the user path by instead using the command sequence


where of course the second command points to my desired user path.
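A pair of commands of this form does the job (a sketch using the documented userpath function; I can't vouch for the exact commands, and the directory below is a placeholder):

```matlab
% Clear the stale userpath value, then point userpath at the desired folder.
userpath('clear')                          % remove the old userpath entry
userpath('C:\Users\me\Documents\MATLAB')   % placeholder: use your own directory
```

Unlike edits made through pathtool, userpath stores its value in the user profile, so it survives a restart even when MATLAB cannot write to its installation directory.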

I discovered that the string containing all of the paths is stored in matlabroot/toolbox/local/pathdef.m, where matlabroot is the parent folder that contains the MATLAB program files. For example, my root folder is C:\Program Files\MATLAB\R2011a\. At the end of this file is the line

p = [userpath,p];

which appends the string in userpath to the front of the string of paths to all of the toolboxes (note that pathdef.m returns the string p).  Also of note is that pathdef.m is called by MATLAB's initialization m-file, matlabrc.m, which is located in the same folder and executed at MATLAB's startup.
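To see which of these files is actually in play on a given installation, a quick check from the Command Window (a sketch; nothing here is specific to my setup) is:

```matlab
% Locate the files involved in building the search path at startup.
which pathdef -all   % every pathdef.m visible on the search path
which matlabrc       % the startup script in matlabroot/toolbox/local
userpath             % display the current userpath string
```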

Finally, there exists in the same directory an m-file called startupsav.m. It is a template which, once renamed to startup.m, may be filled with any lines of code that one wishes to run at startup. I believe that these lines are executed once the MATLAB GUI opens, well after matlabrc.m has run.

I'm still not sure where userpath is defined.

Addendum: You may instead place your pathdef.m or startup.m files in the userpath directory, and I believe that these take precedence over the versions in matlabroot/toolbox/local/.

In my startup.m, I've added paths that contain files that I do not wish to be associated with MATLAB's toolboxes. The commands in this file look like

addpath some_directory\MatlabMPI\src\ -END

where the -END flag places this path at the end of the string returned by the path command.

Tuesday, May 15, 2012

The purpose of numerics, part 2

In a post that I wrote nearly two years ago, I wondered about the usefulness of numerics and simulations for science. After all, a simulation necessarily tests the outcomes of a model, and a model is only an approximation for the real world. Because of this, I placed a higher emphasis on understanding experimental data than on numerics.

At the time of that writing I was irritated by a trend I had found in papers to present numerical results that exactly matched an experimental outcome. This mode of thinking, I believe, consisted of
  1. performing an experiment;
  2. developing a model that explained the experiment;
  3. using computer software to show that the model can exactly reproduce the experiment and was therefore good.
Unfortunately, the only value I can see in this exercise is confirming that the model reproduces the experimental data. What is perhaps more interesting to pursue numerically is the effect of varying the model's parameters on the observable quantities. This is especially true for simulations of many-body interactions, like the Ising model, where either it is not feasible to explore the full parameter space experimentally or the outcome is too difficult to predict analytically.
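The Ising model is a good illustration of this kind of parameter scan. A minimal sketch (not taken from any particular study; lattice size, sweep count, and temperature range are arbitrary choices) of a Metropolis simulation scanning temperature looks like:

```matlab
% Metropolis sweep of a small 2D Ising model (J = 1, k_B = 1), scanning
% temperature T -- a parameter scan that is easy numerically but hard
% experimentally.
L = 16;                          % lattice side
nSweeps = 2000;                  % Monte Carlo sweeps per temperature
for T = 1.5:0.5:3.5
    s = 2*(rand(L) > 0.5) - 1;   % random initial spins, +1 or -1
    for sweep = 1:nSweeps
        for k = 1:L^2
            i = randi(L); j = randi(L);
            % sum of the four nearest neighbors (periodic boundaries)
            nb = s(mod(i,L)+1,j) + s(mod(i-2,L)+1,j) ...
               + s(i,mod(j,L)+1) + s(i,mod(j-2,L)+1);
            dE = 2*s(i,j)*nb;    % energy change if spin (i,j) flips
            if dE <= 0 || rand < exp(-dE/T)
                s(i,j) = -s(i,j);
            end
        end
    end
    fprintf('T = %.2f, |magnetization| = %.3f\n', T, abs(mean(s(:))));
end
```

Sweeping T through the critical region shows the magnetization collapsing, which is precisely the sort of parameter-dependence question a simulation answers better than a single fit to one experiment.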

Essentially I've restated what I said in that earlier post, but I needed to do it. It's easy, at least for me, to dive head first into developing a computer simulation because it's great fun, but if I don't slow down and think about why I'm doing the simulation, I risk wasting my time on a menial task that adds little to the body of scientific knowledge.

Monday, May 7, 2012

What is a good description for entropy?

"Insight into Entropy," by Daniel F. Styer, is a nice paper that appeared in the American Journal of Physics in 2000. In the paper, he argues for a qualitative explanation of entropy that involves two ideas: disorder and freedom.

Entropy as disorder is a common analogy given to students who are learning about thermodynamics, but Styer provides several arguments for why this qualitative description fails to adequately explain the idea. One such argument involves a glass of shredded and broken ice. Despite the fact that the ice has been shattered into many pieces, the entropy of the glass of ice is less than that of an identical glass filled with water. The water may seem to be more ordered because it is homogeneous, but it does not possess a lower entropy.

Styer's idea of entropy as freedom attempts to explain how systems can possess multiple classes of states (commonly known as macrostates) and how entropy limits the microscopic details of each class. In the game of poker, the probability of being dealt a royal flush is identical to that of any other specific five-card hand dealt without replacement. However, the number of configurations that form a royal flush is extremely small, so the entropy of the class of hands forming a royal flush is low. This very low entropy class of poker hands restricts the possible configurations of the microstate (the description of which five cards are in one's hand), completing the analogy with freedom. High entropy macrostates have greater freedom in choosing their microstate because they have a larger number of microstates to choose from; low entropy macrostates (royal flushes, for example) have less freedom.
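The counting behind the analogy is easy to make concrete (a sketch; the entropy here is the dimensionless S = log(Ω), with Boltzmann's constant set to 1):

```matlab
% Multiplicity (number of microstates) of two poker macrostates, and the
% corresponding dimensionless entropy S = log(Omega).
totalHands = nchoosek(52, 5);   % every possible five-card hand: 2,598,960
royalFlush = 4;                 % one royal flush per suit
S_royal = log(royalFlush);      % about 1.4: very little freedom
S_all   = log(totalHands);      % about 14.8: the "any hand" macrostate
fprintf('Omega_royal = %d, S = %.2f\n', royalFlush, S_royal);
fprintf('Omega_total = %d, S = %.2f\n', totalHands, S_all);
```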

Styer does propose retaining the "entropy as disorder" description, by suggesting that the freedom and disorder analogies be presented simultaneously to negate the emotions commonly associated with either word. His example of such a combined description reads: "For macrostates of high entropy, the system has the freedom to choose one of a large number of microstates, and the bulk of such microstates are microscopically disordered."

Finally, on a different train of thought: teaching ideas by analogy apparently must be done with sensitivity to the emotions commonly associated with a word. I had never considered this idea before, but I will surely be mindful of it in the future.