Welcome!

Welcome to the blog for the Oberlin College Geomorphology Research Group. We are a diverse team of students working with Amanda Henck Schmidt on geomorphology questions. This blog is an archive of our thoughts about our research, field work travel notes, and student research projects. Amanda's home page is here.

Friday, December 23, 2016

And Now for Something Completely Different: Archaeology Post

Hey all,
My name is Lucas Brown, and I was doing some research this last semester with Amanda advising me, so she asked that I do a short post about my findings.

Over the course of this last semester I did a hydrology study of an area in Italy that has been excavated by archaeology professor Susan Kane since the mid-90s. This area has been inhabited for thousands of years by various cultures: first the Samnites, then the Romans, and now modern Italians. In ancient times the area was heavily terraced, so I was attempting to find out whether the ancient terracing had affected the watersheds and general hydrology of the area. To do this I used a modern DEM of the area and also created another DEM using data that has been collected on the local terraces. I then created two sets of watersheds to see if they differed.
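The comparison step can be sketched in a few lines: if each set of watersheds is rasterized to an integer label grid, the watersheds whose delineation changed can be flagged by comparing per-label cell counts. This is a toy example with made-up grids, not the Monte Pallano data:

```python
import numpy as np

# Toy label grids: each cell holds the ID of the watershed it belongs to
# (made-up values, not the actual DEM-derived watersheds).
base = np.array([[1, 1, 2],
                 [1, 2, 2],
                 [3, 3, 2]])
terraced = np.array([[1, 1, 2],
                     [1, 1, 2],
                     [3, 3, 2]])

def watershed_areas(labels):
    """Cell count per watershed ID, ignoring 0 (outside any watershed)."""
    ids, counts = np.unique(labels[labels != 0], return_counts=True)
    return dict(zip(ids.tolist(), counts.tolist()))

base_areas = watershed_areas(base)
terraced_areas = watershed_areas(terraced)

# Watersheds whose area differs between the two delineations
changed = {wid for wid in base_areas
           if terraced_areas.get(wid, 0) != base_areas[wid]}
print(sorted(changed))  # → [1, 2]
```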

I found that several of the watersheds containing many of the terraces did change, showing that the terraces affected the local hydrology. Below are several figures that show the general area and also the watersheds that changed.
The Monte Pallano area within the Abruzzo region



Base DEM watersheds

Terraced DEM watersheds


Sunday, December 18, 2016

Hey everyone,

As the semester has come to an end, so has my time in the Geomorph lab until next Fall. Like Chloe, I'll be studying away in New Zealand next semester. As I wrote in my first blog post, research this semester has truly been a learning experience. I learned a lot not only about the subjects we are researching and how our methodology plays into it, but also about how research groups operate, particularly in a college setting.

After fall break, I quickly learned that there was a lot about the research that I didn't even touch on in this first semester, particularly the computational and qualitative aspects. After working with Chloe and Monica on leaching and running the CH-0xx samples, Chloe and I were tasked with running the numbers on as many of these samples as we could. Using a combination of several programs, we gathered and organized the data and eventually exported it to Excel. We then took the activity levels for lead-210 and cesium-137 and corrected these values taking into account radioactive decay, finally getting the numbers we were looking for. At the beginning of the semester, I had no idea how much number-crunching we would be doing, and it was really satisfying to have an actual number to point to and say "that's our data".
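The decay correction Gabe describes boils down to one formula: multiply the measured activity by exp(λΔt) to bring it back to the collection date, with λ = ln 2 / t½. A minimal sketch of that step (illustrative numbers, not the lab's actual scripts):

```python
import math

HALF_LIFE_YEARS = {"Pb-210": 22.3, "Cs-137": 30.17}  # half-lives in years

def decay_correct(measured_activity, isotope, years_since_collection):
    """Correct a measured activity back to the collection date:
    A_collection = A_measured * exp(lambda * t), lambda = ln(2) / t_half."""
    lam = math.log(2) / HALF_LIFE_YEARS[isotope]
    return measured_activity * math.exp(lam * years_since_collection)

# e.g. a sample counted two years after collection
print(round(decay_correct(50.0, "Pb-210", 2.0), 2))  # → 53.21
```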

It has also been really cool to see that research is a work in progress for everyone. When I first joined the research group, I thought that the methodology was set in stone, but I quickly learned that this methodology was constantly being reviewed and refined through trial and error. I saw this process in action this semester with the introduction of neutralization into the leaching process, which has come with its own set of challenges, but has ultimately proven to be a successful idea.

I look forward to returning next Fall and doing more research!

Gabe

Tuesday, December 13, 2016

End of Fall Semester Update

Hey y’all! Chloe here.

The end of the semester is almost upon us. I hope everybody is getting through reading period and final exams in one piece. Whew, this was a long, fulfilling semester.
The semester began with Gabe, Monica, and me partnered up. We began working through the CH-0XX samples and managed to put a good dent in them through leaching and running. GSA was a big adventure way back in September and I am so grateful that I had the opportunity to go to Denver along with a bunch of other Obie geology majors. Getting to see Obie graduates was also a plus, and I know that Monica and I both learned a great deal about how to present a research poster at a conference!
Moving past that, I was able to become more comfortable with the whole leaching process, even while the lab had to deal with acid rain in the fume hoods and the switch to using sodium hydroxide as a base. Using the base to neutralize the sample is not difficult and does not add much time to the overall leaching process, which is a lucky fix for our acid rain problem. And there is a lovely ~COLOR CHANGE~ in the leachate when adding the base, which is always entertaining. Using pH paper strips, we can just dab a drop of the solution onto the paper to see how acidic or basic it is. We are shooting for a neutral pH, which tends to come out rather greenish on the pH strip and some sort of orange-brown-red color in the beaker full of leachate. When heating the leachate to dry it down, a substantial amount of salt forms on the sides of the beaker. This sneaky salt can simply be scraped off the sides back into the leachate, but it is definitely a cool new addition to the whole leaching process!
Gabe and I also learned how to calculate decay-corrected activity for a variety of CH-0XX samples. This involved learning more about the software on the lab computers and what it is capable of. It was a bit tricky, but with the help of Amanda and one or two of our lab group members, it was easily done. Gabe and I have uploaded our results, which include samples from the CH-0XX group, onto the shared lab group Google Drive. :)
Well, I’m off to study abroad in ~New Zealand~ next semester so I won’t be posting until next fall. I hope you all have a wonderful spring semester and the leaching process continues to progress smoothly!

Signing off,
Chloe



The Dirt Lab is getting salty

It's time again for another dirt lab update!

As the semester comes to an end, I can definitely say this has been a more eventful experience than last year. As mentioned in previous posts, our methodology for readying samples to be run was and is a multi-step process. The samples must first be sieved, then leached and centrifuged, separated into leachate and residue, and then dried. A key component we've recently added is neutralizing the sample using sodium hydroxide.

Neutralizing the sample means that we are now evaporating off just water rather than highly acidic HCl. Adding this step to the procedure turned out to be quite the undertaking, though. First, just getting the NaOH took longer than expected. In the downtime, Hannah, Monica, and I (but really it was mostly Hannah and Monica) had the exciting task of locating and preparing samples which had yet to be run. Once the base came in, we got to work on a way to bring the pH of the solution as close to 7 as possible. Initially we thought using an indicator would be the best route. We thought wrong. Trying to observe a color change in an already colored solution, which forms a black complex as base is added, is like trying to thread a needle from space. It's just not going to work in a timely or efficient manner. Considering also that the base and acid are different concentrations, and that the acid is reacting with carbonates within the sample, it's not the most straightforward process.

Still, we tried to create a method for this madness. Mae Kate, Monica, and I started out by adding a fair amount of methyl red to our yellow solution, turning it a dirty orange. We then tried adding half the acid's worth of base and saw it turn immediately black and brown. Unsurprisingly, we overshot this delicate process and created a solution that was extremely basic. We then corrected it by adding a healthy shot of dilute acid (turns out it was 10 M instead of 12), which promptly brought it back down to an overly acidic pH of about 2.
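For what it's worth, the first-guess arithmetic is simple: HCl and NaOH react 1:1, so the stoichiometric base volume scales with the ratio of concentrations. A quick sketch with illustrative numbers, bearing in mind this is only an upper bound, since the carbonates in the sample have already consumed some of the acid:

```python
def naoh_volume_ml(acid_molarity, acid_volume_ml, base_molarity):
    """Stoichiometric NaOH volume to neutralize HCl (1:1 reaction).
    Real leachates need less, because carbonates in the sediment have
    already eaten some of the acid -- hence titrating up from below."""
    return acid_molarity * acid_volume_ml / base_molarity

# e.g. 40 mL of 12 M HCl against 10 M NaOH
print(naoh_volume_ml(12, 40, 10))  # → 48.0 (mL, upper bound)
```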

After this event, Amanda suggested a titration setup, so I went about building one. I found some clamps and a titration beaker, and using the methyl red and a slightly expired pH meter kit, was able to correct the first sample to a pH of around 7.5. The end result was a beaker of very nicely separated H2O and sample.

As this way of checking the pH of the samples was extremely tedious, Amanda was able to get us pH papers to test with. The only problem with using those is the possibility of losing some sample to the paper. To minimize that, we're using a stirring rod and removing all drops from it, so that it is as dry as possible while still damp enough from the solution to check acidity levels.

So that's primarily what we've been up to this semester in the lab: perfecting this new method of leachate and residue separation. We're still working out the exact numbers to achieve neutrality; right now the easiest way to know when to stop adding base is a significant color change. We also need to figure out the new composition of the samples in terms of the metal-to-salt ratio. Previously we were accounting for just iron and magnesium in our program calculations, but now that we're precipitating a salt, there are new variables to be considered.

Until next time,

Marcus

Sunday, October 23, 2016

Geology Can Be Confusing Sometimes: A Memoir

Hey Everyone,

I'm Gabe, a 4th year (ish), and I just started working in the lab this semester. And it sure has been a heck of a time. I'll start by saying that I came into geology relatively late. Since coming to Oberlin as a transfer last year, I've been taking as many geo classes as possible, and finally worked up the courage to declare last semester. It's been an eventful and exciting year, and I've learned so much over the past three semesters.

With that said, though, I'd be lying if I said I didn't find myself scratching my head sometimes, especially when it comes to research this semester. I took Earth Surface Processes last semester, and while I remember learning about isotopes and geochemistry... things have a way of slipping away during summer vacation. At the beginning of this semester, I figured that since I'm now a member of a team that's largely dedicated to working with these concepts, it's probably a good idea for me to understand them. During one of our first lab meetings this year, Amanda drew a few pictures and explained to me how the different fallout radionuclides (most importantly 7Be, 210Pb, and 137Cs) reach the soil, how they are differentially adsorbed and stored, and how they are useful to geologists for understanding erosion. On a broad level, I understood the purpose of the research and the driving observations but... there were still a lot of questions, many of which I couldn't even figure out how to put into words.

Over fall break, Amanda gave me three papers which are all directly relevant to the research we do in the lab. Reading science writing, an art in itself, is still something I'm getting used to, but I made my way through all three papers unscathed. What I found really cool about these three papers was how differently they used the same relatively specific tools to investigate fairly different things. One paper, from 1992, was largely theoretical. It investigates how radionuclides, or "radiometric fingerprints," can be used to gather information on suspended sediment sources (another topic I should maybe read up on) because they are independent of many otherwise limiting factors like lithology and soil type. Another paper focused on how radionuclides can be used to distinguish different types of erosion, specifically different types of sub-surface erosion. This paper had a slightly more practical approach, and talked about how this distinction between erosion types can help guide erosion mitigation attempts. The third paper did not discuss radionuclides, but instead focused on the calculation of sediment yields in order to understand anthropogenic effects. This paper also had some fun historical context, which even drew my parents into talking about geochemistry with me.

Reading these three papers, especially combined with the experience of leaching and running samples, was incredibly helpful. Although I still don't understand a lot of the nitty-gritty geochemistry, and may have skipped or skimmed a few paragraphs here or there, I definitely feel a lot more confident in my role in the research group as we enter the second half of the semester. And I'm certain that my understanding will only become more refined as the semester continues.

Tuesday, October 11, 2016

Let the leaching begin!

Hey everybody!

I hope you all are having a wonderful semester and enjoying the refreshing autumn weather. October break will soon be upon us!
I’ve had a very exciting first few weeks back at Oberlin. A large portion of the semester consisted of prepping for GSA back in September. That involved finalizing the leaching data that Adrian and I collected last semester and any new data collected over the summer by Marcus and Monica. Monica, Marcus, Adrian and I also spent time putting together a poster worthy of a presentation at GSA. And before I knew it, I was on a plane to Denver, Colorado to present at the conference!
It was a fantastic experience overall. I had the opportunity to see Oberlin alums, meet professionals from all different fields of geology, and even explore the city of Denver a bit with my fellow geo-nerds. The presentation of the poster with Adrian and Monica went smoothly and we spent the rest of our time at GSA going to geology talks and meetings. It was a very informative trip and even gave me a few new ideas about which fields of geology I may wish to pursue in the future.
And now on to…LEACHING! Our lab group has split up into smaller teams and everybody is embarking on the wonderful adventure that is leaching in the Carnegie geochem lab. Each lab member is getting to learn how to do the entire process and many samples are being leached every week. Monica, Gabe, and I are working on the CH-0XX samples (samples from the China trip a year or two ago).
As of right now, that is all that’s been happening. I’ll upload another blog post later in the year and hopefully have more to say. Monica and Gabe are both on my mini-team for the leaching project, so feel free to check out any blog posts that they upload and keep up-to-date on our progress!
Hope you all are enjoying the amazing ~fall~ weather!

Bye-bye!
-Chloe


P.S. Check out Marcus’ post if you want a quick refresher on the leaching process! :)

Thursday, October 6, 2016

Mid Semester Update

Hello everyone,

The colors haven't started changing outside just yet, but inside Oberlin's geochemistry lab there's certainly a lot going on. This semester has been focused on leaching! A former student in the lab came up with a plan to obtain data without having to use all sorts of proxy methods, and to avoid getting unrealistic values. To do this we remove grain coatings from samples, increasing the accuracy of our measurements. This involves using a fair amount of HCl to completely soak the sample; after some trial runs we've concluded that a 4:1 ratio of acid to sample is the sweet spot.

Our process involves soaking the sample for at least 24 hours in acid in Nalgene containers, which are mostly submerged in a sonic bath. This procedure is perfect for Halloween, as the contents of the bottles turn a spooky orange-ish brown when they are done. After that we have to decant off the leached grain coatings from the rest of the sample, a meticulous process involving the centrifuge, de-ionized water, and lots of patience. This part is the most time-consuming overall, but once the two components are separated out it's smooth sailing: throw the container holding acid on a hot plate and let all the HCl sizzle off, and put the other container holding the residue in the oven. When that's all over we should have two containers of one sample that are ready to be run through our gamma spectrometer.

This is the process I've been working through alongside most of the others working in Amanda's lab this semester, and it's been a good time so far. This half of the fieldwork isn't as glamorous as spending nearly three weeks in China, but it's rewarding nonetheless. We've faced a few hurdles this semester already, but overall it's looking to be a very data-rich end of 2016.

Once the semester is over we should have a computer full of meaningful data points, which will make for a much more interesting post, full of analysis and maybe even some pretty graphs.

But until then this is all I've got to offer,
Marcus

Monday, July 25, 2016

Young Geologists' Field Day

Hello hello,

Lab work this summer has been lots of fun; it’s also been flying by! I feel like I’ve only just arrived, but here it is, the last week to get some work done. Just a few days ago Monica and I had the opportunity to go collect samples locally from the Vermilion River. While not as exciting as a trip to China, it was definitely a fun experience. Before we could go, Monica and I had to round up the supplies from the many different rooms Geology holds within Carnegie and Severance. Those supplies included several sieves and buckets, a fair few sample bags, a large spoon to collect the sample from the river, and a handful of spoons to move the sediment from the sieve into the bags (super scientific, I know!). Some people who are more critical of their terrain may look at the photos and say that we were not in a river so much as a drainage ditch on the side of the road; I would say to those people, you’re right, but where’s the fun in that? We can nitpick the topographical features of the beautiful state of Ohio all we want, but a day that has field work in it is a good day.





Monica here, with an update on the sediment we collected. Unlike Marcus, I had a bit of a rougher time, as I was the one sacrificed to ride in the trunk with the sieves and freshly collected samples. After much labor and help/interference from Amanda’s children, we managed to collect the samples we needed from the river for use in our further leaching adventures. They were collected as part of a project that I’ve been working on to determine the right ratio of acid to sample for leaching. Unfortunately, the previous sample I leached was not collected recently enough, so the fallout radionuclide Be-7 had decayed away. By quickly leaching this new sample we should be able to get the data we're missing. The samples are currently in the oven, evaporating off the ditch water and getting ready to be stripped in acid.


To make the field photos more enjoyable, we included some toddlers of Gerber-baby-level cuteness (Amanda’s children), who accompanied us into the field. Photo credit to Amanda’s wonderful au pair, Jenny. Enjoy!








Wednesday, July 13, 2016

Staying STRONG in the lab

Hello!
Monica here, celebrating my second week as a STRONG Scholar working in the geomorphology lab this July. I'm an incoming first-year and I hail from Shorewood, Wisconsin (just north of Milwaukee). Currently I am considering doing Oberlin's 3-2 Engineering program with the ultimate goal of becoming an environmental engineer. I am also interested in Hispanic Studies and Politics. In my free time I enjoy folding origami lotuses, swimming competitively, and creating scavenger hunts.
Thus far at Oberlin, I have been working on some odds and ends in my research. From day one I started running calculations to test how changing the accuracy of parameters affects the efficiency (which we use to calculate how active a sample is) in a program called Angle. The goal here was to discover how specific the parameters need to be to stay within the margin of error. On a larger scale, it helps us judge how much information we are required to know about a sample to receive accurate results when performing calculations. The three parameters were sample composition, source height, and density.
I used a bunch of different methods to simplify the composition percentages from nine elements down to as few as one. Overall, I found that simplifying the data didn't have a very large impact. I was definitely surprised by that, but at the same time I was relieved: knowing that composition data isn't as important makes the process of running these calculations easier for others in the future.
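The check itself amounts to comparing each simplified run's efficiency against the full nine-element value and the allowed margin. A hypothetical sketch (the efficiencies and the 5% margin here are made up, not Angle output):

```python
reference_eff = 0.050   # efficiency from the full 9-element composition (made up)
margin = 0.05           # assumed +/-5% relative margin of error

simplified_runs = {     # made-up efficiencies from simplified compositions
    "one_element": 0.0512,
    "two_elements": 0.0504,
    "oxides_only": 0.0543,
}

# True where the simplified composition stays within the margin
within = {name: abs(eff - reference_eff) / reference_eff <= margin
          for name, eff in simplified_runs.items()}
print(within)  # → {'one_element': True, 'two_elements': True, 'oxides_only': False}
```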
Here's a graph showing five of the different methods I used with black lines above and below the x-axis marking the margin of error that thou shalt not cross:

I also studied a few different source heights and these turned out to be quite a bit more influential.

I found a similar pattern when it came to density as well.

Based on the graphs I made, it was apparent that these values needed to be somewhat correct to get accurate calculations. Because of that, my next step of lab work was to analyze 13 of the leachate samples from one of the lab's previous expeditions to China. This presented numerous challenges, as leachate is the outer coating of sediment, separated using acid. The samples had a unique acidic smell and presented some difficulties in measurement. It was a double challenge to be using calipers for the first time on leachates that were fractions of a millimeter thick. Not to mention, when the ordeal was over I had to pray that I had washed everything thoroughly enough that it wouldn't corrode. Despite my struggles in getting all the values to agree with one another, I was finally able to get results accurate enough to graph.
Ultimately I am really excited because my work will contribute to Amanda's research on a new way to more accurately measure lead-210 and hopefully create a better system for quantifying this erosion indicator.
Next in the process, Marcus and I will be leaching more samples to be examined later on and *fingers crossed* the hood will remain intact as the HCl evaporates. We are also all reading a series of Parsons and Foster papers (and their critiques) about the validity of using lead-210 as an indicator of erosion, which are laden with witty scientific dissing.
I'm hoping my next two weeks this summer of research will be just as enjoyable as the past two and I want to thank both Marcus and Amanda for facilitating this fantastic experience and helping me get adjusted. Geomorphology rocks!

Sunday, May 15, 2016

Please someone tell me how to write generic code for specific tasks

Heyo, Joe here, coming off of a great semester of working with Marcus to coerce our data to play nice with Python. Our goal was to unite the geographic data we had for the area upstream of each point with the isotope data from Harbin, our hard-working germanium detector, into a format that could be sensibly manipulated with Python for graphing and statistical purposes. The problem was twofold: figuring out a way to sensibly store and access data, and taking that data and using it to make graphs that were understandable and looked nice. I dove into writing a whole tangle of functions to pull out the data of interest, and Marcus became good friends with the matplotlib documentation, his only ally in the noble fight against the matplotlib library.

It must’ve been just about two years ago now that I first started to truly get my hands dirty with both Python and ArcPy, ArcGIS’s Python library. I started with a simple goal: create unique watershed files for each point in a shapefile full of sample collection locations. Through a combination of the ArcPy documentation, Stack Overflow answers, and a dear friend of mine with far more Python experience than myself, I was able to create such a script. It was tailored to my specific project, but I tried my best to make it something that could be reused for other projects. Looking back now, I would do it all totally differently, but ya live and learn!

When I began work on my next script, which extracted spatial information for each watershed, I became consumed with finding my way around ArcGIS’s “table joins,” perhaps the most obtuse way to unite two sets of data. I won't go into detail, but I accomplished my goal, learning a lot about how ArcGIS stores data in the process. Thus began the quest that consumes me to this day: avoiding ArcGIS at all costs and offloading as much work as possible to Python.
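For a taste of why plain Python feels friendlier: the same kind of join, done outside ArcGIS, is a couple of lines (the sample IDs and fields below are illustrative, not our actual tables):

```python
# Unite watershed attributes with isotope results by sample ID --
# a plain-Python stand-in for an ArcGIS table join (illustrative data).
watershed_attrs = {"CH-001": {"area_km2": 12.5}, "CH-002": {"area_km2": 30.2}}
isotope_results = {"CH-001": {"Pb-210": 42.0}, "CH-002": {"Pb-210": 55.4}}

joined = {sid: {**watershed_attrs[sid], **isotope_results[sid]}
          for sid in watershed_attrs.keys() & isotope_results.keys()}
print(joined["CH-001"])  # → {'area_km2': 12.5, 'Pb-210': 42.0}
```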

In the fall, I declared, to no one in particular, my intent to secede from ArcGIS, and began work on a Python project to manage my data, which would only dirty its feet by dipping into ArcGIS as needed for certain spatial analyses, then whisking the results out of the clutches of whatever heinous file Arc would create, and into the sanctuary of my Python datatype. Progress was slow, mainly because I kept trying to start over! My code worked fine, but I was never satisfied with how it was structured; I wanted this to be something that people doing similar, but distinct, work could use. I struggled to avoid design decisions specific to my project, which was hard when I was also using the project to do my actual work at the same time. Eventually looming deadlines (apparently you need “results” when you “present” at a “conference”) forced me to move forward, so I ended the semester with a datatype to store the data about my samples, some functions to grab that data, and some functions to graph it.

Now, from what you read above, when you saw “functions to graph” you may have thought to yourself, “oh, this must be where matplotlib comes into play,” and you would be right, if I had an ounce of sense in me. For a reason I am unable to explain, whether it was ignorance of the existence of graphing-specific libraries like matplotlib, hubris, or just a naive fondness for LaTeX, I decided to write functions to generate plots using the PGFPlots package for LaTeX. This meant that instead of calling functions like plt.plot(), I was writing long format strings to generate a file in the LaTeX markup language. The results were rather pleasing, but when I came back to the project in February with Marcus I thought a more straightforward approach would be appropriate. When we found out about matplotlib, I thought, “Now here is the answer to all of our problems! All we have to do is hook up the Python code that stores the data to matplotlib and out will come beautiful graphs.” Sure, I thought we might have to do some tweaking to get graphs up to our very refined standards, but how hard could it be? For that answer, see Marcus’s post.

The template...



...and the result!

So, as Marcus went off to figure out just how to make matplotlib give us graphs that could be read with ease, I went off to figure out how I could pull the data we wanted out of the jumble of samples we were working with.  Our dataset was a collection of 83 soil samples from three different field seasons.  For each sampling location we determined the area upstream of it, and calculated various geographic parameters.  

Now that I had gotten all this data, it was time to get it organized logically. This took some time, but boy was it worth it. Once I knew that all the data would behave the same way, I wrote a series of functions (way too many functions, probably, but once you learn Lisp, there’s no going back) to return lists of the data we actually wanted to plot. If each sample has activity and error values for 3 different isotopes, a thousand different geographic parameters, a location, links to files, and lists of other samples it is related to, it’s not quite plug-and-play. But it got done, and it meant that if some of the values for our samples changed (as they often do), or if new samples got added or old samples got removed, as long as they conformed to the standards, we didn’t have to do a darned thing! Just take the list of sample objects, plug it into the function that pulls out the data you want to graph, and then shoot the result of that into Marcus’s graphing code. Badabing badaboom!
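A minimal sketch of the kind of sample record and extractor function described above (the field names and values are illustrative, not the lab's actual schema):

```python
from dataclasses import dataclass

# Hypothetical sample record: real samples carry many more fields
# (locations, file links, related samples, ...).
@dataclass
class Sample:
    name: str
    activities: dict        # isotope -> activity
    errors: dict            # isotope -> 1-sigma error
    upstream_area_km2: float

def pull(samples, isotope):
    """Return plottable (activities, errors) lists for one isotope."""
    acts = [s.activities[isotope] for s in samples]
    errs = [s.errors[isotope] for s in samples]
    return acts, errs

samples = [
    Sample("CH-001", {"Pb-210": 42.0}, {"Pb-210": 3.1}, 12.5),
    Sample("CH-002", {"Pb-210": 55.4}, {"Pb-210": 4.0}, 30.2),
]
acts, errs = pull(samples, "Pb-210")
print(acts)  # → [42.0, 55.4]
```

Swap out the samples list and the downstream plotting code doesn't change, which is the whole point.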

Things should be smoother from here on out... until I finally figure out that perfect structure and write the definitive program for managing soil samples, computing watersheds, doing some spatial analyses, and plotting and tabulating the data. Someday it will happen, and we will be better for it.

It's been a great 2+ years working for OGRe, but I wouldn't be surprised if I come back in one form or another, even if it's just to preach about why we should be scripting more and clicking less.

Thursday, May 12, 2016

Lab Work (Spring 2016)

Hi all, 
Marcus here, ready to share what I've accomplished while working in the lab this spring. This project was quite an undertaking from the very beginning; I had just conquered the mighty task of “Hello World” in Java when I found out that I’d be paired with Joe to prevent more Excel-made graphs from entering publication. We were given the option to either work in R, a language completely unfamiliar to both Joe and me, or to see if there were any ways to get Python to cooperate. Enter matplotlib (MPL).
This library was so combative that we had to make use of a separate application called Jupyter, which already had MPL integrated into it, to start working. Determining a starting location was a task in itself. Our first graph made extensive use of an oh-too-kind Stack Overflow user's code, which gave us a 7x7 grid of information. That was a lot to take in, so we started looking into how to get more specific with what we were presenting. The next graphs we created specifically targeted lead, in-channel vs. overbank and resample vs. original, including variants that accounted for negative values and those that didn’t. The nature of working with lead values is that they all had crazy error bars, which got to be visually distracting. Suffice it to say, there were a lot of moving pieces that didn’t want to work together at first.
This is when we started changing up our approach: Joe delegated the creation of the graphs themselves to me, while she continued working on her already-built code that was able to grab and pair related information sets. My job required me to understand what data I was being given and how to use it in the graphs I would be creating, so naturally I needed to understand, at least in part, the code Joe was already working on. Now my to-do list included: learning how Jupyter worked, learning how to use matplotlib, and understanding how Joe's code was structured. Maybe it would’ve been easier to work in R. Jokes aside, it was a somewhat daunting task, so I figured it’d be best to start by seeing how Joe's code worked, as she was much more accessible than the authors of the other two applications.
It was a really interesting experience getting to see how Joe went about setting up her code to retrieve data and return meaningful results. The structure she had set up was fairly straightforward, so figuring out how to add to and build off of what was already there wasn’t too difficult. It was a nice introductory period before we got our hands dirty with matplotlib. As I mentioned earlier, we started off making plots that looked nothing like what our final graphs became. It was definitely a learning experience as we figured out what worked best and how to get information-dense graphs that were still intelligible. Stack Overflow and the many documentation web pages quickly became purple links in my Google searches as I tried to figure out why this wasn’t working or how to change that seemingly obvious part of the graph. Changing from regular plots to scatterplots, playing with subplots and legends, and even changing the font size, color, and shapes all seemed to have specific intricacies that produce unintuitive results for seemingly small modifications. Ultimately, though, Joe and I were able to work through these bugs as they popped up and create some good figures in the process.

This semester was a great introduction to working in a lab and I'm excited to see what the next several years will bring!

Tuesday, May 10, 2016

STEM Night


On Friday 6 May, the Geomorphology Group joined other geology students to participate in STEM night, an outreach event for 3rd-5th graders. We had a great time. Below are pictures of OC students interacting with elementary school students. They are playing with the stream table (with Marcus and Adrian), exploring mineral properties (with Andrea), and looking at fossils (with Alex). Sydney and Andrew aren't pictured.