Week of December 10 - December 17

December 10
Today I arrived at Quarknet around 3:45 and began looking at the Excel spreadsheet for possible new ways to isolate the cosmic rays. We have already made some reasonable conjectures about how we will cut the data, so I feel there is enough to work with in cutting out the unnecessary data. Of all the cuts we have proposed, it does not matter in what order they are made, so I think it is appropriate to begin with any of the parameters we have previously determined. I will start by cutting out all data that does not meet (or come close to meeting) the E/M = 1 assertion. This must hold because the detector mistakes cosmic rays for dimuon events, when in reality each is a single muon moving along a straight trajectory through the machine. Thus, the equation E^2 = M^2 + P^2, when P = 0, reduces to E = M, so E/M = 1. Because I looked at the dataset previously and used scatterplots to see where the cosmic ray trends stopped, I predicted that they end around 1055 events, so I will make my cuts starting just past this number. Here is the data that I have extracted: I know that this image does not appear different in any way, simply because it isn't, but if you were to scroll down to the bottom, it would end at row 1199 (I chose to include a few extra rows just for good measure, knowing that more cuts will be made).
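As a sketch of what this first cut is doing, here is a minimal Python version of the E/M = 1 filter. The (E, M) pairs and the tolerance are made-up stand-ins for the spreadsheet columns, not the actual values:

```python
# Hypothetical event rows: (E_sum, invariant_mass), standing in for the
# Excel columns; the real values come from the spreadsheet.
events = [
    (10.0, 10.0),    # cosmic-ray-like: E ~ M, so E/M ~ 1
    (10.0, 9.995),   # close enough to 1 to keep
    (91.2, 3.1),     # dimuon-like: E >> M, so this row gets cut
]

# E^2 = M^2 + P^2 with P = 0 gives E = M, i.e. E/M = 1.
# Keep events whose E/M falls within a small tolerance of 1.
TOL = 0.001

def is_cosmic_candidate(e, m):
    return m > 0 and abs(e / m - 1.0) < TOL

kept = [(e, m) for e, m in events if is_cosmic_candidate(e, m)]
print(kept)  # -> [(10.0, 10.0), (10.0, 9.995)]
```

The tolerance would need tuning against the real data; the point is only that the cut keeps rows where E/M stays near 1.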

Now that I have made this cut, I will look at the data with a scatterplot of the E/M values. This should hopefully be a straight line at a value of 1, showing that all of my data has an E/M value of 1. Here is the plot: Although this plot is not entirely a straight line, the data represented is accurate and consistent. Remember how I chose to add extra rows just for good measure? The straight line begins to veer upwards at an x-axis value of 1055, the number of events that I found to hold true. Thus, I will make another cut by trimming off the extraneous events. I have now taken the data down to 1055 events, and it seems that this graph is a bit misleading because of its tiny range. As the line trails upwards, it only reaches 1.0007, which is essentially 1. Therefore, this step has gone according to plan. Now I will cut the next parameter.
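The trim described above can be sketched in a few lines of Python. The rows here are invented stand-ins with a tiny upward drift in E, just to mimic the 1.0007 tail; in the spreadsheet this step is simply deleting the rows past 1055:

```python
# Made-up (E, M) pairs with a tiny drift in E, mimicking the slight
# upward trail the plot showed near the end of the range.
rows = [(10.0 + 0.000001 * i, 10.0) for i in range(1200)]

trimmed = rows[:1055]                 # keep only the first 1055 events
ratios = [e / m for e, m in trimmed]  # E/M for every kept row
print(len(trimmed), max(ratios))      # all ratios stay essentially at 1
```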

I chose to move on to phi1 + phi2. For reasons explained in the past, this sum should be close to zero for cosmic rays. However, when I look at the phi1 + phi2 column, I see a wide range of numbers from -3 to +3. This could mean a couple of things: either there is a mistake in the formulas of the Excel sheet, which is possible but unlikely, or there are simply fewer cosmic rays than I thought. The latter is more likely, but I should first investigate to make sure that the parameters are set correctly.
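To make the intended cut concrete, here is a small sketch of what "phi1 + phi2 close to zero" would flag. The phi pairs and the 0.1 tolerance are my own illustrative choices, not values from the spreadsheet:

```python
# Hypothetical (phi1, phi2) pairs in radians.
phi_pairs = [
    (-1.57, 1.57),   # sums to ~0: back-to-back in phi, cosmic-ray-like
    (0.80, 0.95),    # sums to 1.75: not back-to-back, likely a real dimuon
]

PHI_TOL = 0.1  # assumed tolerance; would be tuned against the real data

labels = []
for phi1, phi2 in phi_pairs:
    s = phi1 + phi2
    labels.append("cosmic-like" if abs(s) < PHI_TOL else "not cosmic-like")
print(labels)  # -> ['cosmic-like', 'not cosmic-like']
```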

I have investigated the parameters, and everything is entered exactly how it should be. I have been thinking that the problem may be that the cosmic rays have more wiggle room on the phi axis than I thought, but the extent to which the data varies seems a little too large. Here is the image of what I am gathering on my plot:

Looking at this shape, it is hard to understand why it varies so much. It seems like a random calculation, when I expected this to be a definitive cut that would isolate the cosmics and be consistent with our conjectures. I will have to sleep on this and investigate at a later date to find out why it is so random and inconsistent.

December 12
Today I arrived at Quarknet at 3:45. I tried to contribute a little more research last night and began to look at my logbook until I realized that I did not have the Excel spreadsheet on any of my personal computers. This is because the file was considerably larger than the maximum size I could email to myself, but now that I have made a cut in the data, I should have no problem meeting the size requirements. So that I can work on this at home or at school, I will email the Excel file to myself.

With that done, today I plan to work with the phi numbers in each phi column to see why I am getting the results that I am. Hopefully I can find some answers as to why the data looks the way it does, even though I expected it to appear differently.

I decided to take a look at the two phi columns side-by-side. I sorted by phi1 so that I could look at the data from the perspective of phi, and what I saw was encouraging. After scrolling down to the first values around -1.57, I found that these particles all looked like cosmic rays: they all had phi1 values of around -1.57 and phi2 values of around 1.57 (roughly -π/2 and +π/2, i.e. back-to-back in phi), which is precisely what we wanted to see. Here is the data that I am talking about. As you can see, all of these particular phi values are quite consistent with cosmic rays. Because of this, I can rule out the possibility of our parameters being flawed, as this is proof that the cosmic rays are being calculated. Thus, I think it is reasonable to assert that there are fewer cosmic rays than we thought, so I will have to make another cut. This should then leave our phi1 + phi2 column with values close to 0.
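Putting the back-to-back phi observation together with the earlier E/M cut, a combined filter might look like the sketch below. The tuple layout (E, M, phi1, phi2) and both tolerances are my own stand-ins for the spreadsheet columns:

```python
# Hypothetical events as (E, M, phi1, phi2) tuples.
events = [
    (10.0, 10.0, -1.57, 1.57),   # passes both cuts: cosmic-ray candidate
    (10.0, 10.0, 0.80, 0.95),    # E/M is fine, but phi1+phi2 is too large
    (91.2, 3.1, -1.57, 1.57),    # phi sum is fine, but E/M is far from 1
]

E_M_TOL = 0.001   # assumed tolerance on |E/M - 1|
PHI_TOL = 0.1     # assumed tolerance on |phi1 + phi2|

cosmics = [
    (e, m, p1, p2) for e, m, p1, p2 in events
    if m > 0
    and abs(e / m - 1.0) < E_M_TOL
    and abs(p1 + p2) < PHI_TOL
]
print(len(cosmics))  # -> 1: only the first event survives both cuts
```

Requiring both conditions at once is what should leave the phi1 + phi2 column sitting near 0 after the cut.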

After talking to Dr. Loughran, we decided that it would be productive to upload the data into a Many Eyes scatterplot. I created an account and pasted the data into ManyEyes, where the plot was created quite simply. I uploaded only the first 1055 events, so I can clearly see the events in question. The results are looking good for cosmic rays, as there are clear trends that point strongly to this notion. Here is a graph of M vs Esum:

This plot shows that E = M, which is exactly what we wanted to see. Next, I looked at phi1 vs M and found some interesting data, also strongly consistent with cosmic rays.

The heavy particles are clustered where we would expect them, while there are two interesting groupings of very light particles. I will ask Dr. Loughran about these when I return, but for now, I have to leave for a movie showing for Theology class.