In this experiment, we used Beer’s law, which states that absorbance, i.e., the amount of light absorbed by a solution, is equal to the molar absorptivity of the compound times the path length of the cuvette times the concentration of the solution. First, we measured the absorbance of a cuvette filled with water, then calibrated the spectrometer by subtracting this blank absorbance from all subsequent measurements. Next, we measured the absorbance spectrum of a 0.4 M solution of CuSO4 and found the wavelength at which this solution had an absorbance of 1. Finally, we measured the absorbance of CuSO4 solutions of known concentrations at that wavelength.
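To make the relationship concrete, the short Python sketch below solves Beer's law, A = εlc, for concentration. The molar absorptivity used here is not a measured value from this report; it is the value implied by the observation that the 0.4 M standard had an absorbance of 1, under the assumption of a standard 1 cm cuvette.

```python
# Beer's law: A = epsilon * l * c
#   A: absorbance (unitless)
#   epsilon: molar absorptivity (L / (mol * cm))
#   l: path length of the cuvette (cm)
#   c: concentration (mol / L)

def concentration_from_absorbance(absorbance, epsilon, path_length_cm=1.0):
    """Solve Beer's law for concentration: c = A / (epsilon * l)."""
    return absorbance / (epsilon * path_length_cm)

# Assumed epsilon = 2.5 L/(mol*cm), implied by A = 1 for the 0.4 M
# standard in a 1 cm cuvette (a hypothetical path length).
print(concentration_from_absorbance(absorbance=1.0, epsilon=2.5))  # 0.4 mol/L
```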
6. Next, fill a cuvette ¾ full with the 0.4 M CuSO4 solution and place it in the spectrometer. Then, have the spectrometer record the absorbance of the solution over wavelengths from 380 to 750 nm. Find and record the wavelength at which the absorbance is 1. Finally, rescale the graph to a maximum absorbance of 2.0, and save the data so it can be printed later.
7. Next, on the LabQuest, change the mode to “Events with Entry”, and on the ABS screen set the wavelength to the one recorded in step 6.
8. Then, fill a cuvette from the first known-concentration vial and place it in the spectrometer. Start data collection on the sample, and when the reading is stable, press Keep and enter the concentration when prompted.
9. Repeat step 8 with the remaining known-concentration vials. Then, press the red button to stop data collection, and save all data.
10. Next, autoscale the graph and add a linear best-fit line (a worked sketch of this fit, and of the unknown determination in step 11, appears after step 12).
11. Finally, fill the cuvette with one of the unknown solutions, place it in the spectrometer, and record the absorbance of the solution. Repeat for all of the unknowns.
12. Save all data and print out both the spectrum graph and the absorbance vs. concentration graph, as well as a table of all absorbances and concentrations for the standards.
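As a sketch of the analysis behind steps 10 and 11, the Python code below fits a linear calibration line to absorbance vs. concentration data and inverts it to estimate an unknown's concentration. The numerical values are hypothetical placeholders; the real values come from the LabQuest standards collected in steps 8 and 9.

```python
import numpy as np

# Hypothetical calibration data: concentrations of the known standards
# (mol/L) and their measured absorbances (unitless).
concentrations = np.array([0.05, 0.10, 0.20, 0.30, 0.40])
absorbances = np.array([0.13, 0.26, 0.49, 0.76, 1.00])

# Step 10: linear best-fit line, A = slope * c + intercept.
# By Beer's law the slope approximates epsilon * l, and the intercept
# should be close to zero for a properly blanked spectrometer.
slope, intercept = np.polyfit(concentrations, absorbances, 1)
print(f"A = {slope:.3f} * c + {intercept:.3f}")

# Step 11: invert the calibration line to estimate an unknown's
# concentration from its measured absorbance (hypothetical reading).
unknown_absorbance = 0.62
unknown_concentration = (unknown_absorbance - intercept) / slope
print(f"Unknown concentration ≈ {unknown_concentration:.3f} M")
```

This inversion is only reliable when the unknown's absorbance falls within the range spanned by the standards, since Beer's law can deviate from linearity at high concentrations.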