Umbar, Shalimar B.
FORENSIC 1 1/24/22
2ND YEAR COLLEGE
BS – CRIMINOLOGY
1. What exactly is a camera obscura and how was it used?
A camera obscura is, in its simplest form, a darkened room with a small hole in one of its
walls. When it is bright outside, light passing through the hole projects an upside-down
image of the outside world onto the wall opposite the hole. The term "camera obscura"
comes from Latin and means "dark room."
A more elaborate camera obscura works in exactly the same way. In the well-known
example in Edinburgh, a large tube with three lenses sits at the top of a tower, and a
tilted mirror above it directs light down the tube onto a white wooden table, where an
image of the outside scene appears. Because the mirror can be tilted and rotated, it
provides a 360-degree view of Edinburgh, and the lenses present the image in the correct
orientation.
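The projection described above follows from similar triangles: a point at height h and distance d_o from the pinhole lands at height h × (d_i / d_o) on a screen d_i behind the hole, on the opposite side of the axis, which is why the image is inverted. A minimal sketch of that geometry (the function name and scene values are illustrative, not from the text):

```python
def pinhole_projection(object_height_m, object_distance_m, screen_distance_m):
    """Project a point through a pinhole using similar triangles.

    Returns the image height on the screen; the negative sign
    reflects the inversion of the camera obscura image.
    """
    magnification = screen_distance_m / object_distance_m
    return -object_height_m * magnification

# A 10 m tall building 50 m from the pinhole, screen 0.5 m behind it:
image = pinhole_projection(10.0, 50.0, 0.5)
# image is about -0.1 m: a 10 cm tall picture, upside down
```

The same ratio explains why a deeper room (larger screen distance) gives a larger, dimmer image.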
2. What significant events occurred during pre-photographic history?
By the close of the 19th century, photographic technology had advanced to hand cameras
and dry plates, enlargers and rapid printing paper, and more powerful lenses and
high-speed shutters. Color photography, although under development, was still several
years away from practical success.
- The Camera Obscura (5th century B.C.)
The camera obscura, literally "dark chamber," existed long before the advent of the
photographic camera. It was a darkened room or enclosed box with a small opening on one
side; light entering through this "pinhole" projected an inverted image of the outside
scene onto the opposite wall or a screen. The phenomenon was known to the ancients (both
Aristotle and the Chinese philosopher Mozi made note of it), but it was not until the
11th century that the Arab scholar Alhazen gave a full explanation of how it worked,
together with a rough diagram of a working model. The camera obscura then gained
popularity as a tool during the Middle Ages and Renaissance, especially as innovators
began to use biconvex lenses to improve the quality of its images. Astronomers used it
to study the sun and solar eclipses without damaging their eyes, and painters used it as
an aid in making portraits and landscapes.
- Photochemistry (18th-19th centuries)
The camera obscura enabled real-time viewing of images, but it took decades for
researchers to develop a technique for permanently retaining images through chemical
processes. In 1725, the German scientist Johann Heinrich Schulze discovered that silver
salts darken when exposed to light. Intrigued, he cut letters from a sheet of paper and
laid them on top of the silver mixture; as he described it, the sun's rays wrote the
words onto the chalk sediment so accurately that many people attributed the result to
some kind of artifice. In 1827, the French inventor Joseph Nicéphore Niépce used a
camera obscura fitted with a lens to create what is now regarded as the world's first
photograph: an eight-hour exposure of the courtyard of his house, captured and "fixed"
on a pewter plate coated with a light-sensitive material known as Bitumen of Judea.
- The Daguerreotype (1837)
Louis Daguerre, a French artist and inventor who had collaborated with Niépce in the
late 1820s, made photography's next great leap forward. In 1837, Daguerre discovered
that exposing iodized silver plates to light produced a faint image that could be
developed with mercury vapors. The new process not only generated a sharper, more
polished image, it also reduced the exposure time from many hours to as little as 10 or
20 minutes. Daguerre named the technique the "Daguerreotype." The French government
granted him a pension in exchange for making his process publicly available, which he
did in 1839. His invention spread around the world and spawned a thriving portrait
industry, particularly in the United States.
- The Calotype (1841)
As "Daguerreotypomania" spread throughout the world, the British inventor William Henry
Fox Talbot devised his own photographic method, which he named the "Calotype." Instead
of metal plates, this approach used photosensitive paper, which was far more affordable.
When the paper was exposed to light, it recorded a latent image that could be developed
and then fixed by washing it with hyposulphite. Although the results were not as sharp
as Daguerreotypes, they had one significant advantage: whereas a Daguerreotype was a
single, unreproducible image, the Calotype's paper negative could be used to make an
unlimited number of positive prints. This negative-positive way of capturing photographs
became one of the foundations of photography for the next century.
- The Wet-Collodion Process (1851)
Both daguerreotypes and calotypes were rendered obsolete in 1851, when a sculptor named
Frederick Scott Archer invented a new photographic process that combined the sharp
visual clarity of the former with the reproducible negatives of the latter. The secret
to Archer's success was collodion, a chemical originally developed as a medical dressing
that proved extremely effective for coating light-sensitive solutions onto transparent
glass plates. Although these "wet plates" reduced exposure times to a matter of seconds,
working with them was time-consuming: the plates had to be exposed and processed before
the collodion mixture dried and hardened, so photographers shooting in the field had to
travel with portable darkroom tents or wagons. Despite this drawback, the wet-collodion
process's unparalleled quality and low cost quickly made it a commercial success. It is
most famously associated with Mathew Brady, who used wet plates to make portraits of his
subjects during the American Civil War and to capture hundreds of striking battlefield
photographs.
- Dry Plates (1871-1878)
For much of the nineteenth century, the wide range of hazardous solutions and mixtures a
camera required put photography out of reach for anyone without a basic understanding of
chemistry. That changed in the 1870s, when Richard L. Maddox and others developed a new
form of photographic plate that suspended silver salts in gelatin, paving the way for
modern photography. Because dry plates retained their light sensitivity for extended
periods, they could be packaged and mass-produced, sparing photographers the time and
frustration of preparing and developing their own wet plates on the road. Dry plates
also allowed significantly shorter exposure times, enabling cameras to capture moving
objects with greater clarity. Using dry-plate cameras of his own design, the
photographer Eadweard Muybridge captured some of the most famous studies of people and
animals in motion during the 1880s, experiments that made him a pivotal figure in the
development of the motion picture.
- Flexible Roll Film (1884-1889)
Photography became genuinely accessible to the general public only in the mid-1880s,
thanks to the inventor George Eastman, who began manufacturing photographic film on
rolls. Compared with bulky glass plates, film was lighter and more durable, and a roll
allowed photographers to take several shots in rapid succession without stopping to
change plates. Eastman made flexible film the main selling point of his first Kodak
camera, introduced in 1888, which held 100 exposures; customers shot the roll and then
returned the camera to the manufacturer to have their photographs developed. This marked
the beginning of the modern era in the history of photography. The camera was extremely
simple to use (Eastman sold it to Victorian shutterbugs with the motto "You push the
button, we do the rest"), but its coated paper film produced images of relatively poor
quality. Celluloid film, introduced a year later, remedied this, and film remained the
primary medium of photography for nearly a century, until the rise of digital cameras
beginning in the late 1980s.
- Autochrome (1907)
The desire for color photography is almost as old as photography itself, but a practical
method for producing color photographs did not arrive until 1907, when Louis and Auguste
Lumière, the brothers widely regarded as early film pioneers, began selling an additive
color process known as "Autochrome." The key to their breakthrough came from an
unexpected source: the potato. By mixing minute grains of dyed potato starch into a
panchromatic emulsion, they were able to create brilliant, striking color images, a
significant step forward from previous attempts at color photography. Autochrome
remained the world's most popular color process until 1936, when the Eastman Kodak
Company introduced its iconic Kodachrome film, a watershed moment that marked the
beginning of the modern era of color photography.
3. What are the advances in photographic technology that occurred during the
19th century?
By the turn of the 20th century, photographic technology had improved significantly and
now comprised hand cameras and dry plates, enlargers and quick printing paper, in
addition to more powerful lenses and high-speed shutters. Color photography, however,
was still in its infancy, and it would be some years before it achieved widespread
success.
4. What are the significant events relevant to the beginnings of forensic
imaging?
Visual aids have been applied in forensic and medical-legal investigations almost as
long as the profession has existed, since visuals help people understand what they hear.
Imaging techniques have also helped enhance standard examination results since the dawn
of forensic medicine in the 1800s. Medical examiners were the first to sketch lesions
and crime scenes, and photographs were soon employed to record crucial medical and legal
findings. Wilhelm Conrad Röntgen's discovery of X-rays in 1895 was a major scientific
event: it enabled scientists to examine a body's interior as well as its exterior, and
X-rays have been employed for forensic, anthropological, medical, and legal purposes for
over a century.
Modern imaging tools have grown in importance in forensic and legal medicine over the
last decade. Rapid cross-sectional imaging methods in radiology have produced an
explosion of novel ways to look inside a body quickly, and modern computing tools let
researchers visualize new findings in two and three dimensions. Photography and 3D
modeling have also become more accessible, which makes them valuable not only in the
laboratory but also at the crime scene.
In forensic medicine, however, new procedures bring new issues. What are the drawbacks
of these methods, which should only be used by professionals? What are the key
medical-legal pitfalls to watch for, and are the approaches good enough? Only scientific
inquiry can resolve these ambiguities, which is why it is critical to assess the pros
and cons of new techniques. One more thing to remember: any approach is only as good as
the person who uses it. Forensic professionals must therefore constantly master new
technologies in order to apply them appropriately.
This special issue aims to inform the forensic community about existing methodologies,
new advances, and open questions. Case studies illustrate how the approaches can be
applied in practice, review articles survey the field, technical notes discuss technical
issues, and original research examines open questions. The issue is dedicated to
forensic imaging approaches that can be used anywhere: a body can be viewed in many
ways, from simple photography to advanced imaging techniques such as MRI and
Multi-Detector Computed Tomography (MDCT), with high-resolution 3D modeling as a
finishing touch. The authors address a wide range of topics, from legal medicine to
anthropology and archaeology; some contributions discuss legal issues related to
forensic imaging, while others span the broader forensic sciences.
This issue is an excellent starting point for those new to forensic imaging as well as
for those already working in the field, since it covers a wide range of topics useful to
both. The reader will come away with a greater understanding of the medical-legal
challenges that imaging can address, as well as those that remain unresolved in this
discipline.
5. Who is Alphonse M. Bertillon and what are his contributions to the field of
criminology?
Alphonse Bertillon (born 1853 in Paris, died 1914), the son of the medical professor
Louis Bertillon, was a French criminologist and anthropologist who pioneered the use of
physical measurements, photography, and systematic record-keeping by police to identify
repeat offenders.
6. What are the important developments in forensic imaging that occurred in
the 20th century?
In the early 1900s, the London police organization Scotland Yard adopted the Henry
system of fingerprint identification. The system had been developed in Bengal in 1897 by
Azizul Haque and Hemchandra Bose, working under Edward Henry; it was a simplified,
workable refinement of Galton's system that built on the work of Galton and other
scientists. Henry also helped the detective department set up a fingerprint bureau the
following year. In 1902, a burglar named Harry Jackson was apprehended and convicted
after leaving a thumbprint on a crime-scene window sill, the first case in UK history in
which fingerprints secured a conviction. By 1903, most European countries had
fingerprint divisions, and New York State became the first American state to use
fingerprints for prisoner identification, ushering in the fingerprint era. The new
fingerprinting technology largely supplanted the traditional anthropometry method, which
entailed measuring various body parts such as arm length and head size. The Henry
technique remained in use until the late twentieth century.
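The core of the Henry system is its primary classification, which sorts ten-print cards into 1,024 bins. In the standard textbook formulation (not given in the text above), fingers are numbered 1-10 from the right thumb to the left little finger, a whorl on fingers 1-2 counts 16, on 3-4 counts 8, on 5-6 counts 4, on 7-8 counts 2, and on 9-10 counts 1; whorls on even-numbered fingers feed the numerator and odd-numbered fingers the denominator, each offset by 1. A minimal sketch:

```python
# Value contributed by a whorl on each of fingers 1-10
# (right thumb .. left little finger).
WHORL_VALUES = [16, 16, 8, 8, 4, 4, 2, 2, 1, 1]

def henry_primary(whorls):
    """Compute the Henry primary classification as (numerator, denominator).

    `whorls` is a list of 10 booleans: True if finger i+1 bears a whorl.
    Even-numbered fingers feed the numerator, odd-numbered fingers the
    denominator; both are offset by 1, so the result ranges 1/1 to 32/32.
    """
    num = 1 + sum(v for i, v in enumerate(WHORL_VALUES)
                  if whorls[i] and (i + 1) % 2 == 0)
    den = 1 + sum(v for i, v in enumerate(WHORL_VALUES)
                  if whorls[i] and (i + 1) % 2 == 1)
    return num, den

# No whorls at all gives the most common class, 1/1:
print(henry_primary([False] * 10))                 # (1, 1)
# Whorls on both thumbs (fingers 1 and 2):
print(henry_primary([True, True] + [False] * 8))   # (17, 17)
```

Because the primary class can be computed from pattern types alone, clerks could file and retrieve cards without comparing ridge detail, which is what made the system practical at scale.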
In the 1900s the Austrian scientist Karl Landsteiner (1868-1943) studied human blood
types, and his findings impacted forensics as well as medicine. Landsteiner's work on
the ABO blood groups made blood (and eventually organ) transfusions safe, and it gave
detectives a simple blood-type comparison for determining whether two blood samples
could have come from the same suspect or person of interest. Around the same time, Paul
Uhlenhuth (1870-1957) developed a chemical test that could distinguish human from
non-human blood. Another significant advance in blood analysis came in 1937, when Walter
Specht introduced luminol as a presumptive test for blood: luminol glows eerily when
applied to traces of blood, which is why it appears so often on fictional forensic
television shows.
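The logic of the blood-type comparison mentioned above is worth making explicit: a mismatch excludes a common source, while a match is only consistent with one, because many people share each ABO type. A minimal sketch of that reasoning (function and labels are illustrative, not from the text):

```python
def compare_abo(sample_a, sample_b):
    """Compare two ABO blood types the way early investigators could.

    A mismatch proves the samples came from different people; a match
    is merely consistent with a common source, never proof of identity.
    """
    if sample_a != sample_b:
        return "excluded"      # different types cannot share a source
    return "inconclusive"      # same type: consistent, but many people share it

print(compare_abo("A", "O"))   # excluded
print(compare_abo("B", "B"))   # inconclusive
```

This asymmetry (exclusion is conclusive, inclusion is not) is why blood typing was eventually superseded by DNA profiling for identification.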
Forensic geology is a relatively young forensic science that has been used to
investigate crimes since the early 1900s, when Georg Popp used soil layers and plant
debris to link a suspect to a crime scene.
Finally, the French forensic scientist Edmond Locard (1877-1966) gave the field what is
still employed today as the Locard exchange principle, often summarized as "every
contact leaves a trace." Locard may have been the world's first true forensic scientist,
and his work on fingerprint analysis revolutionized the discipline. Although he is
frequently associated with the exchange principle, he is not known to have stated that
exact phrase; the underlying idea is simply that it is extremely difficult to commit a
violent crime without leaving evidence, and microscopes can find and classify particles
left on clothing.
Locard's exchange principle is taught in forensics classes and is one of the first
things students learn in the field.