Ubuntu DNS issues (last kernel update)

I just had some trouble with my laptop not resolving hostnames after the last update of the Ubuntu kernel. It seemed the DNS server settings were at fault. I resolved the issue by renaming / removing the deprecated old DNS settings file resolv.conf (/etc/resolv.conf). Removing or renaming the file restored all DNS-dependent services.
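For reference, a minimal sketch of the fix. It is demonstrated here on a scratch copy so it is safe to run as-is; on the live system the target is /etc/resolv.conf, the mv needs sudo, and restarting networking afterwards does no harm.

```shell
#!/bin/bash
# demonstrate the rename on a scratch copy; on the real system replace
# $workdir with /etc and prefix the mv with sudo
workdir=$(mktemp -d)
touch "$workdir/resolv.conf"                          # stand-in for /etc/resolv.conf
mv "$workdir/resolv.conf" "$workdir/resolv.conf.bak"  # rename rather than delete
ls "$workdir"                                         # resolv.conf.bak
```

Renaming (rather than deleting) keeps a backup around in case the old settings turn out to be needed after all.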

Optimizing the cellulose extraction setup

[caption id="" align="alignright" width="350"] pliers to cut wood samples...[/caption]

Over the last month I have been optimizing the cellulose extraction method as I learned it at GFZ Potsdam. As most protocols are case specific, some tuning is always needed. In my case I was confronted with very hard, dense wood, as is common in tropical tree species. This caused serious trouble when applying the methods used in Potsdam; in particular, cutting up hard wood samples with a scalpel is dangerous at best. I therefore opted to give sharp electric pliers a go. In all honesty, this works like a charm. Even for soft wood this should be the preferred method: you have far more control over the sample and don't run the risk of injuring yourself on a very sharp blade.

[caption id="" align="alignright" width="350"] Memmert hot water bath[/caption]

Other tweaks to the protocol include replacing the hot plates and glass containers used to heat the samples with a Memmert hot water bath. I could borrow one from the people on the sixth floor (the Food Safety lab, if I'm not mistaken). This setup has a hood, a programmable thermostat and an auto-refill mechanism that tops up the bath when evaporation brings the water level below a set point. These options take care of some of the inconveniences of the hot plate setup, mainly having to refill the water bath manually and the risk of running out of water while heating the reagents. If funded, it will set me back only as much as two hot plates. Win win...

Finally, the glass filter vials are cleaned by ashing the material left in the vials. However, this method is limited by the size of the muffle furnace at hand, and given that the ISOFYS lab supports varied research topics, the furnaces are generally occupied. Alternatively, a colleague, Dr. Samuel Bodé, suggested oxidizing all material left in the vials with potassium persulfate (at 85 °C for an hour or two). This method is often used to measure the content of organic material, turning all organic material into CO2 and measuring the off-gassed CO2. In my case, however, the same oxidation would take care of cleaning the vials. The method is simple and will free up space in the muffle furnace.

UAV vegetation monitoring

Yesterday I took my quadcopter for a spin with a small camera attached to take pictures of the field below. The field where I do my test flights and general practice runs is an unmanaged grassland with a diverse structure depending on moisture and nutrient availability. The idea was to see if it was possible to quantify this diversity based upon texture metrics. Sadly, the city is hosting the sheep herding world cup this weekend and has therefore cut the grass. I still gave it a try, as texture patterns remained visible in the pictures I took (sadly without much ecological significance, and mostly due to straw left behind by the mower). The picture below shows a subset of the original image (excluding the landing skids of the quadcopter). Note the differences in texture.

[caption id="" align="alignnone" width="508"] B&W grassland image from UAV[/caption]

To quantify the complexity of the grassland I applied the FOTO method as described by Proisy et al. (2007). The FOTO method uses radially averaged Fourier spectra to capture differences in the dominant spatial frequencies of a scene. Most common implementations of the method quantify canopy structure, but the approach is universal, as it can capture any difference in texture (from grassland to canopy or urban areas). The result of this classification is shown as an overlay of RGB colours on the original black-and-white image below. Although the camera is not high end and my quadcopter not top notch either, the results show the potential of UAVs and basic image processing techniques to quantify grassland (bio-)diversity. Optimizing the camera (better optics) and the UAV platform for heavy lifting (a hexa- or octocopter setup) would make for an easily deployable setup to map (bio-)diversity at regular intervals and at low cost.

[caption id="" align="alignnone" width="508"] FOTO based grassland classification[/caption]

References:

Proisy, C., Couteron, P., & Fromard, F. (2007). Predicting and mapping mangrove biomass from canopy grain analysis using Fourier-based textural ordination of IKONOS images. Remote Sensing of Environment, 109(3), 379–392. doi:10.1016/j.rse.2007.01.009

 

PlantCam file ordering script

I recently got a copy of the data retrieved by my PlantCam installed in Yangambi, DR Congo. Before processing could proceed, I had to rename all files from the standard Wingscapes PlantCam format to a format compatible with the PhenoCam GUI.

I wrote a little bash script that does just this job. If run within a directory of PlantCam images, it will use the exif data to create the appropriate filenames and file structure required by the PhenoCam GUI. You can find the code below. The code depends on the CLI exif program and takes one parameter, namely the site name.

#!/bin/bash

# converts Wingscapes PlantCam filenames
# and moves the files into the desired file structure
# for easy processing with the PhenoCam GUI or toolkit
#
# NOTE: requires a running version of linux/Mac or cygwin
# with exif installed.
#
# written by Koen Hufkens, 24/08/2013

# pick your own sitename (first script argument)
sitename=$1

# loop over all Wingscapes JPG files
# (a glob avoids the pitfalls of parsing ls output)
for i in *.JPG;
do
	# extract date and time from the exif data
	date=`exif "$i" | grep "Date and Time" | head -n 1 | cut -d'|' -f2 | cut -d' ' -f1 | sed 's/:/_/g'`
	time=`exif "$i" | grep "Date and Time" | head -n 1 | cut -d'|' -f2 | cut -d' ' -f2 | sed 's/://g'`

	# construct the final filename and rename (with copy not move)
	filename="${sitename}_${date}_${time}.jpg"

	cp "$i" "$filename"

done

# sort files into the correct data structure (a folder for each month)

years=`ls *.jpg | cut -d'_' -f2 | sort -u`
months=`ls *.jpg | cut -d'_' -f3 | sort -u`

for i in $years;
do

	echo $i

	# if the year directory does not exist, create it
	if [ ! -d "./$i" ]; then
		mkdir ./$i
	fi

	for j in $months;
	do

		# if the month directory does not exist, create it
		if [ ! -d "./$i/$j" ]; then
			mkdir ./$i/$j
		fi

		# move all matching files; suppress errors for
		# year/month combinations without any files
		mv *_${i}_${j}_*.jpg ./$i/$j 2>/dev/null

	done
done
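As a quick sanity check of the naming scheme the script produces, the snippet below builds one filename by hand, assuming a hypothetical site name "yangambi" and exif timestamp "2013:08:24 10:30:00":

```shell
#!/bin/bash
# rebuild the sitename_YYYY_MM_DD_HHMMSS.jpg pattern from a
# hypothetical exif timestamp, mirroring the script's cut/sed steps
sitename="yangambi"
stamp="2013:08:24 10:30:00"
date=`echo "$stamp" | cut -d' ' -f1 | sed 's/:/_/g'`
time=`echo "$stamp" | cut -d' ' -f2 | sed 's/://g'`
echo "${sitename}_${date}_${time}.jpg"   # yangambi_2013_08_24_103000.jpg
```

The underscores in the date field are what the second half of the script relies on when it cuts out the year and month to build the folder structure.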

 

Cellulose, finally!

Two days ago I finalized the cellulose extraction by homogenizing the sample material. The samples were then put into a drying oven overnight. The result is a paper-like material, as seen in the picture. Next up is packing the samples into tin cups for 13C analysis.
