One of the most infamous World War II inventions is the atomic bomb. In August 1945, the United States launched the world’s first (and so far, only) nuclear attacks, on the Japanese cities of Hiroshima and Nagasaki, killing an estimated 110,000 to 210,000 people.

While the bomb stands out for its devastating impact, there were many other nonlethal innovations during the war in the fields of medicine and technology that have drastically reshaped the world.

Some of these innovations were based on research or designs that predated the war but couldn’t take off until the U.S. or British governments funded them to aid the Allied forces. Here are six innovations that came out of that development surge.

1. Flu Vaccines

Ralph Morse/The LIFE Picture Collection via Getty Images
A guinea pig is inoculated to determine the type of pneumonia and to aid in the diagnosis of other infectious diseases aboard the U.S.S. Solace, a Navy hospital ship, c. 1942.

The influenza pandemic of 1918 and 1919 had a major effect on World War I, and it motivated the U.S. military to develop the first flu vaccine. Scientists began to isolate flu viruses in the 1930s, and in the 1940s, the U.S. Army helped sponsor the development of a vaccine against them.

The U.S. approved the first flu vaccine for military use in 1945 and for civilian use in 1946. One of the lead researchers on the project was Jonas Salk, the U.S. scientist who would later develop the polio vaccine.


2. Penicillin

Bettmann Archive/Getty Images
Injured British Pvt. F. Harris waits for a medic to inject penicillin in preparation for an operation on a hospital train on its way to a station in England. Harris was wounded during an attack on a position in Normandy.

Before the widespread use of antibiotics like penicillin in the United States, even small cuts and scrapes could lead to deadly infections. The Scottish scientist Alexander Fleming discovered penicillin in 1928, but it wasn’t until World War II that the United States began to mass-produce it as a medical treatment.

Manufacturing penicillin for soldiers was a major priority for the U.S. War Department, which touted the effort as “a race against death” in one poster. Military surgeons were amazed by how the drug reduced pain, increased the chance of survival and made it easier for nurses and doctors to care for soldiers on the battlefield.

The United States considered the drug so critical to the war effort that, to prepare for the D-Day landings, the country produced 2.3 million doses of penicillin for the Allied troops. After the war, civilians gained access to this life-saving drug, too.


3. Jet Engines

Power Jets Ltd/SSPL/Getty Images
The first jet propulsion engine, designed by Frank Whittle, c. 1938. In May 1941, an aircraft powered by Whittle’s engine took off from Cranwell, the first real proof that jet propulsion was a viable alternative to the propeller.

Frank Whittle, an English engineer with the Royal Air Force, filed the first patent for the jet engine in 1930. But the first country to fly a jet engine plane was Germany, which performed a flight test of its model on August 27, 1939, just a few days before the country invaded Poland.

“Both Germany and Japan had been really getting ready for World War II for about a decade,” says Rob Wallace, the STEM education specialist at The National WWII Museum in New Orleans.

With the onset of the war, the British government developed planes based on Whittle’s designs. The first Allied plane to use jet propulsion took flight on May 15, 1941. Jet planes could fly faster than propeller planes, but they also required far more fuel and were harder to handle. Though jet engines came too late in their development to affect the outcome of the war, they would go on to transform both military and civilian transportation.


4. Blood Plasma Transfusion

ullstein bild/Getty Images
Medics tending to a wounded soldier on D-Day administer a blood plasma transfusion.

During World War II, a U.S. surgeon named Charles Drew standardized the production of blood plasma for medical use.

“They developed this whole system where they sent two sterile jars, one with water in it and one with freeze-dried blood plasma and they’d mix them together,” Wallace says.

Unlike whole blood, plasma can be given to anyone regardless of blood type, making it easier to administer on the battlefield.

5. Electronic Computers

SSPL/Getty Images
Members of the Women’s Royal Naval Service (WRNS) operate Colossus, October 1943. Colossus, the world’s first programmable electronic computer, was housed at Bletchley Park in Buckinghamshire, where cryptographers deciphered top-secret military communiqués between Hitler and his armed forces.

In the 1940s, the word “computers” referred to people (mostly women) who performed complex calculations by hand. During World War II, the United States began to develop new machines to do calculations for ballistics trajectories, and those who had been doing computations by hand took jobs programming these machines.
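
To get a feel for what those trajectory calculations involved, here is a minimal sketch, in modern Python rather than anything ENIAC actually ran, of the step-by-step arithmetic behind a firing table; the muzzle speed, launch angle, and drag constant are illustrative assumptions, not historical inputs:

```python
# A simplified, modern sketch of a firing-table calculation: stepping a shell's
# flight forward in tiny time slices under gravity and air drag. The muzzle
# speed, launch angle, and drag constant below are illustrative assumptions,
# not historical ENIAC inputs.
import math

def trajectory_range(speed_mps, angle_deg, drag_k=5e-5, dt=0.01):
    """Return the horizontal range in meters, using simple Euler steps."""
    g = 9.81  # gravitational acceleration, m/s^2
    vx = speed_mps * math.cos(math.radians(angle_deg))
    vy = speed_mps * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while y >= 0.0:  # step until the shell returns to ground level
        v = math.hypot(vx, vy)            # current speed
        vx -= drag_k * v * vx * dt        # drag slows horizontal motion
        vy -= (g + drag_k * v * vy) * dt  # gravity plus drag on vertical motion
        x += vx * dt
        y += vy * dt
    return x

print(f"Estimated range: {trajectory_range(450, 35):,.0f} m")
```

A human computer worked through thousands of these small time steps by hand for a single trajectory, which is why machines that could do the arithmetic automatically were so valuable.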


The programmers who worked on the University of Pennsylvania’s ENIAC machine included Jean Jennings Bartik, who went on to lead the development of computer storage and memory, and Frances Elizabeth “Betty” Holberton, who went on to create the first software application. Lieutenant Grace Hopper (later a U.S. Navy rear admiral) also programmed the Mark I machine at Harvard University during the war, and went on to develop the first compiler and help create COBOL, one of the earliest widely used programming languages.

In Britain, Alan Turing invented an electro-mechanical machine called the Bombe that helped break the German Enigma cipher. While not technically what we’d now call a “computer,” the Bombe was a forerunner to the Colossus machines, a series of British electronic computers. During the war, programmers like Dorothy Du Boisson and Elsie Booker used the Colossus machines to break messages encrypted with the German Lorenz cipher.

6. Radar

Time Life Pictures/US Navy/The LIFE Picture Collection/Getty Images
Personnel manning a radar scope during World War II.

The first practical radar system was produced in 1935 by British physicist Sir Robert Watson-Watt, and by 1939 England had built a network of radar stations along its south and east coasts. MIT’s Radiation Laboratory, or “Rad Lab,” played a huge role in advancing radar technology in the 1940s. However, the lab’s original goal was to use electromagnetic radiation as a weapon, not a form of detection.

“Their first idea that they had was that if we could send a beam of electromagnetic energy at a plane, maybe we could kill the pilot by cooking them or something,” Wallace says. “The cooking thing wasn’t working, but they were getting bounce-back that they could receive and they had the idea that they could use electromagnetic radiation just like they used sound radiation in sonar. So they started working on radar.”
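
The “bounce-back” in that story comes down to simple arithmetic: a radio pulse travels at the speed of light, so half the measured round-trip time, multiplied by that speed, gives the distance to whatever reflected it. The short sketch below, with invented pulse times rather than real radar data, illustrates the conversion:

```python
# A toy version of echo ranging: a radio pulse travels at the speed of light,
# and half its round-trip time gives the one-way distance to the target.
# The pulse times below are invented sample values, not real radar data.
SPEED_OF_LIGHT = 299_792_458  # meters per second

def echo_range_km(round_trip_s):
    """Distance to the target in kilometers; the pulse covers the path twice."""
    return SPEED_OF_LIGHT * round_trip_s / 2 / 1000

for t in (0.0002, 0.0005, 0.001):  # hypothetical round-trip times in seconds
    print(f"Echo after {t * 1000:.1f} ms -> target roughly {echo_range_km(t):.0f} km away")
```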

Radar helped the Allied forces detect enemy ships and planes. Later, it proved to have many non-military uses, including guiding civilian aircraft and ships and detecting major weather events like hurricanes.