
In an earlier post, I discussed a least-square fit with errors on the y-axis (for statistical fits, check the PyMC and EMCEE posts). In almost any fit, having an estimate of the fit uncertainty is a must. The better we know the noise characteristics of the experiment, the better we can estimate the uncertainty of the fit parameters. In this post, I show a more serious example in which we have errors on both axes.

Unlike the previous example, we do not use the curve_fit module of SciPy. Instead, SciPy has another dedicated module for orthogonal distance regression (odr). The program with some comments is shown below:

import numpy as np
from pylab import *
from scipy import odr

def func(p, x):
    # odr model functions take the parameter vector as the first argument
    a, b, c = p
    return a * x * x + b * x + c

# Model object
quad_model = odr.Model(func)

# test data and error
x0 = np.linspace(-10, 10, 100)
y0 = -0.07 * x0 * x0 + 0.5 * x0 + 2.
noise_x = np.random.normal(0.0, 1.0, len(x0))
noise_y = np.random.normal(0.0, 1.0, len(x0))
y = y0 + noise_y
x = x0 + noise_x

# Create a RealData object: sx and sy are the 1-sigma measurement errors,
# not the noise realizations themselves
data = odr.RealData(x, y, sx=1.0, sy=1.0)

# Set up ODR with the model and data
# (do not reuse the name odr here: it would shadow the module)
odr_obj = odr.ODR(data, quad_model, beta0=[0., 1., 1.])

# Run the regression.
out = odr_obj.run()

# print fit parameters and 1-sigma estimates
popt = out.beta
perr = out.sd_beta
print('fit parameter 1-sigma error')
print('---------------------------')
for i in range(len(popt)):
    print(str(popt[i]) + ' +- ' + str(perr[i]))

# prepare confidence level curves
nstd = 5.  # to draw 5-sigma intervals
popt_up = popt + nstd * perr
popt_dw = popt - nstd * perr

x_fit = np.linspace(min(x), max(x), 100)
fit = func(popt, x_fit)
fit_up = func(popt_up, x_fit)
fit_dw = func(popt_dw, x_fit)

# plot
fig, ax = plt.subplots(1)
rcParams['font.size'] = 20
errorbar(x, y, yerr=1.0, xerr=1.0, ecolor='k', fmt='none', label='data')
xlabel('x', fontsize=18)
ylabel('y', fontsize=18)
title('fit with error on both axes', fontsize=18)
plot(x_fit, fit, 'r', lw=2, label='best fit curve')
plot(x0, y0, 'k--', lw=2, label='True curve')
ax.fill_between(x_fit, fit_up, fit_dw, alpha=.25, label='5-sigma interval')
legend(loc='lower right', fontsize=18)
show()

Please note that Python is case sensitive, so do not change the upper/lower case in the commands above. A general comment: you can easily change the second-order function of this example to any desired function. The method we used to estimate the uncertainties of the fit parameters is the standard one, using the diagonal elements of the covariance matrix.
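To illustrate swapping in a different model, here is a minimal sketch that fits a straight line instead of a parabola (the function name linear_func, the seeded random generator, and the noise levels are my own choices, not from the program above):

```python
import numpy as np
from scipy import odr

def linear_func(p, x):
    # odr model functions take the parameter vector first
    a, b = p
    return a * x + b

# synthetic data: y = 2x + 1 with 0.1-sigma noise on both axes
rng = np.random.default_rng(1)
x0 = np.linspace(0, 10, 50)
x = x0 + rng.normal(0.0, 0.1, len(x0))
y = 2.0 * x0 + 1.0 + rng.normal(0.0, 0.1, len(x0))

model = odr.Model(linear_func)
data = odr.RealData(x, y, sx=0.1, sy=0.1)
out = odr.ODR(data, model, beta0=[1.0, 0.0]).run()
print(out.beta)     # best-fit [slope, intercept], close to [2.0, 1.0]
print(out.sd_beta)  # 1-sigma parameter errors
```

Only the model function and the starting guess beta0 change; the rest of the odr machinery stays the same.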

In earlier posts, I discussed statistical fits with PyMC and EMCEE. The advantage of statistical methods is that they are not sensitive to the form of the chi-square function. This is important in cases where the merit function does not have a well-defined minimum. The advantage of chi-square methods is that they are generally much faster. In this post, I show a typical example of a least-square fit with measurement error. As usual, we are interested in estimating the fit parameters as well as their uncertainties.

As you see in the above example, we fit a simple function with measured y-errors, estimate the fit parameters and their uncertainties, and plot a confidence band of a given range. The program is shown below:

import numpy as np
from pylab import *
from scipy.optimize import curve_fit

def func(x, a, b, c):
    return a * x * x + b * x + c

# test data and error
x = np.linspace(-10, 10, 100)
y0 = -0.07 * x * x + 0.5 * x + 2.
noise = np.random.normal(0.0, 1.0, len(x))
y = y0 + noise

# curve fit [with only y-error]
# sigma is the 1-sigma error of each y measurement (here 1.0 for all points);
# absolute_sigma=True keeps these errors unscaled in the covariance matrix
popt, pcov = curve_fit(func, x, y, sigma=np.ones(len(x)), absolute_sigma=True)
perr = np.sqrt(np.diag(pcov))

# print fit parameters and 1-sigma estimates
print('fit parameter 1-sigma error')
print('---------------------------')
for i in range(len(popt)):
    print(str(popt[i]) + ' +- ' + str(perr[i]))

# prepare confidence level curves
nstd = 5.  # to draw 5-sigma intervals
popt_up = popt + nstd * perr
popt_dw = popt - nstd * perr

fit = func(x, *popt)
fit_up = func(x, *popt_up)
fit_dw = func(x, *popt_dw)

# plot
fig, ax = plt.subplots(1)
rcParams['xtick.labelsize'] = 18
rcParams['ytick.labelsize'] = 18
rcParams['font.size'] = 20
errorbar(x, y, yerr=1.0, xerr=0, ecolor='k', fmt='none', label='data')

xlabel('x', fontsize=18)
ylabel('y', fontsize=18)
title('fit with only Y-error', fontsize=18)
plot(x, fit, 'r', lw=2, label='best fit curve')
plot(x, y0, 'k--', lw=2, label='True curve')
ax.fill_between(x, fit_up, fit_dw, alpha=.25, label='5-sigma interval')
legend(loc='lower right', fontsize=18)
show()

Please note that using the measurement error is optional. If you do not have y-errors, simply omit the sigma argument in the fit procedure:

# curve fit [without y-error]
popt, pcov = curve_fit(func, x, y)

You still get an estimate of the uncertainty of the fit parameters, although it is less reliable. In the next post, I show an example of a least-square fit with errors on both axes.
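As a minimal sketch of the two variants side by side (reusing the quadratic model; the seeded random generator is my addition for reproducibility): with uniform errors both calls give the same best-fit parameters, but the covariance is estimated differently.

```python
import numpy as np
from scipy.optimize import curve_fit

def func(x, a, b, c):
    return a * x * x + b * x + c

rng = np.random.default_rng(0)
x = np.linspace(-10, 10, 100)
y = -0.07 * x * x + 0.5 * x + 2. + rng.normal(0.0, 1.0, len(x))

# with known 1-sigma y-errors; absolute_sigma=True keeps them unscaled
popt1, pcov1 = curve_fit(func, x, y, sigma=np.ones(len(x)), absolute_sigma=True)

# without errors: the covariance is scaled from the fit residuals instead
popt2, pcov2 = curve_fit(func, x, y)

print(np.sqrt(np.diag(pcov1)))  # 1-sigma errors from the known noise level
print(np.sqrt(np.diag(pcov2)))  # 1-sigma errors inferred from residuals
```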

It is a common practice in programming to use one variable as a template for another and then later change the value of the new variable, something like

b = a

b = 3

At this point, variable a still has its old value; it was not replaced by 3. So far so good. This works in Python like in other programming languages, but what about compound objects (compound objects contain other objects: lists, dictionaries, tuples)? They will NOT behave like this.

Let's have a closer look at the problem. We can check the memory address of a given variable like this:

print(id(a), id(b))

The id() function returns the memory address of its argument, so one can instantly check whether two variables are pointing to the same address or not.
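A minimal sketch with a simple (immutable) value shows the behaviour described above:

```python
a = 3000
b = a                    # b is just a second name for the same object
print(id(a) == id(b))    # True: both names point to one address

b = 5                    # rebinding b attaches it to a different object
print(id(a) == id(b))    # False
print(a)                 # a keeps its old value: 3000
```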

In the case of compound variables, the situation is different, because assignment creates only a new reference to the same object, not a copy:

>>> x1 = ["a", "b"]
>>> x2 = x1
>>> print(x1)
['a', 'b']
>>> print(x2)
['a', 'b']
>>> print(id(x1), id(x2))
43746416 43746416

>>> x2 = ["c", "d"]
>>> print(x1)
['a', 'b']
>>> print(x2)
['c', 'd']
>>> print(id(x1), id(x2))
43746416 43875200

x1 and x2 originally point to the same memory address, but as soon as x2 is reassigned to a new list, it gets a new memory address; x1 is unaffected. Note, however, that no copy of the data was ever made: the two names shared one object. So if we change only one element of x2 in place (instead of reassigning it), x1 will be modified as well:

>>> x1 = ["a", "b"]
>>> x2 = x1
>>> x2[1] = "h"
>>> print(x1)
['a', 'h']
>>> print(x2)
['a', 'h']


Be careful! We actually had two names for one object; the assignment did not create a copy, and we never assigned a new object to x2. To avoid this problem, use a deep copy:

from copy import deepcopy

x2 = deepcopy(x1)
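For a nested list, a short sketch of the difference between a shallow and a deep copy (the variable names shallow and deep are mine):

```python
from copy import copy, deepcopy

x1 = [["a", "b"], ["c"]]
shallow = copy(x1)    # new outer list, but the inner lists are shared
deep = deepcopy(x1)   # fully independent copy, inner lists included

x1[0][0] = "z"
print(shallow[0])     # ['z', 'b'] -- the shared inner list changed too
print(deep[0])        # ['a', 'b'] -- the deep copy is unaffected
```

For a flat (non-nested) list, a shallow copy such as x1[:] is already enough; deepcopy matters once the container holds other mutable containers.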

 

Hopefully this helps prevent confusing errors in your Python code.

Debian, Fedora, Kubuntu

I have used Debian since Sarge. Back then, mounting a USB disk was far from trivial. It is much more user friendly now. For a while between 2014 and 2016, I used Fedora and Kubuntu instead. Both are more appropriate if you have a modern laptop, as they are faster in incorporating new kernels into their distributions (compare a life cycle of 6 months vs. 2 years). This particularly affects things like sound, LAN, and wireless. Without an internet connection, it is difficult to update the system in the first place if it cannot recognize your network interface. I have to mention that, with some geeky tricks, a workaround is usually doable, and I do not buy the bleeding-edge new laptops anyway. However, a more standard solution is to live with either Fedora or Kubuntu and install Debian a year later: by then, the latest (testing) kernels are usually modern enough to support the main functions like networking.

Doing science and mathematical calculations, it is frustrating to see a system crash every now and then for nonsense reasons: Trash crashed, Copy crashed, and so on. I have never experienced any of this on a stable Debian system, so the least I can do is spend a week getting everything working with Debian before giving up on it.

There are also some more fundamental differences in the philosophy of Debian versus Fedora or Kubuntu. While you get bug fixes and minimal updates for stable software, an update never changes a package dramatically. So if you want new functionality, it does not arrive before a major release. Debian is very stable, and that makes it very interesting for systems running heavy jobs.

Unlike the Venus transit, which is a rare event, there is a Mercury transit every 7.5 years on average. Between 1601 and 2400, there are only 14 Venus transits. If you have not seen the last one on June 06, 2012, it is too late, since the next Venus transit occurs on Dec 11, 2117! Therefore, we should focus on the Mercury transit.

The next transit of Mercury occurs on May 09, 2016. It starts at about 11:12 UT and ends at 18:42 UT. The transit is visible in most of Asia, Europe, Africa, and America. Note that the apparent size of the solar disk is about half a degree, while the apparent size of the Mercury disk is only 8 arc seconds (1 degree = 60 arc minutes = 3600 arc seconds).
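Using the numbers quoted above, a quick back-of-the-envelope check of the size ratio:

```python
# apparent diameters quoted in the text
sun = 0.5 * 3600      # solar disk: 0.5 degree = 1800 arc seconds
mercury = 8.0         # Mercury disk in arc seconds

print(sun / mercury)  # 225.0 -> Mercury spans ~1/225 of the solar diameter
```

This is why you need a (properly filtered!) telescope or telephoto lens to see the transit at all: a 1/225 speck is invisible to the naked eye.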

The next Mercury transits in this century are the following (times in UT):

Date            start    max      end
Nov 13, 2032    06:41    08:54    11:07
Nov 07, 2039    07:17    08:46    10:15
May 07, 2049    11:03    14:24    17:44
Nov 09, 2052    23:53    02:29    05:06
May 10, 2062    18:16    21:36    00:57
Nov 11, 2065    17:24    20:06    22:48
Nov 14, 2078    11:42    13:41    15:39
Nov 07, 2085    11:42    13:34    15:26
May 08, 2095    17:20    21:05    00:50
Nov 10, 2098    04:35    07:16    09:57

Losmandy G11 mount

I have used the famous Losmandy G-11 mount for a few years. It is a super heavy mount; you need quite a bit of force to move it around. The mount was serviced recently, and with a rough polar alignment, it allowed us to take exposures of up to five minutes at a focal length of 300 mm.

Close-up view of the Losmandy G-11 mount.

The Losmandy G-11 mount.

The ball head to connect DSLR cameras.

The ball head allows you to point the camera in any direction of the sky very easily. Depending on the weight of the camera and lens, you should consider a ball with a diameter of at least 30-40 mm for a 5 kg load.

Debian 8 (Jessie)

Recently I bought a Lenovo T430 on eBay. The main reason I didn't like the T440 or T540 was their absolutely horrible touchpad! After buying the Lenovo T430 with Windows 7, the most trivial step was to install Linux on it. For that, I tried Debian, Kubuntu, Fedora, and CentOS for two weeks in VirtualBox on my old laptop (T530). Since I had been using Debian from Sarge (3) till Wheezy (7), it was no surprise that I converged on Debian.

Debian did a fantastic job: in the live environment, it had no problem with LAN, wireless, or sound. I had a familiar package manager, synaptic, and immediately after installation I activated some multimedia and backports repositories for non-free libs.

Debian Jessie Desktop

Debian automatically configured the F-keys, so all of them work out of the box. The only thing I had to change was the behaviour of the touchpad, so that a tap on the touchpad is recognised as a click.

I installed KDE 4 as my default desktop environment and all of the familiar packages like emacs, kstars, digikam, gimp, etc.

Numeric libraries like ATLAS, LAPACK, BLAS, FFTW, CFITSIO, OpenMPI, etc. can be easily installed either via synaptic or simply by running the following as root:

# apt-get install libfftw3-dev

Compared to old days, Debian is now so user friendly that it does not require much experience to either install or configure it.

Science Packages

Debian also offers "science" meta-packages like science-physics, science-mathematics, etc. Each science package is a group of many packages in that category, so installing one science package will install all of them. There are also -dev versions of the science packages.

Firefox on Debian

There is no particular advantage of the Debian brand of Firefox, called Iceweasel, over the Mozilla version of Firefox. However, I decided to install Firefox instead of Iceweasel: since I had Fedora 17 on my previous laptop, I could move my whole Firefox profile (bookmarks, saved passwords, etc.) directly to Debian, and it was fairly easy.