People with diabetes didn't live very long before the discovery of insulin in 1921, because doctors had little to offer them. The only treatment that helped was a strict diet that severely limited carbohydrates, and even that bought patients only a few extra years, if they didn't die of starvation first. The breakthrough began in 1889, when two German scientists discovered that removing a dog's pancreas caused it to develop the signs of diabetes.
They had found the organ responsible for insulin production and began working on ways to harness it. In 1921, a young surgeon named Frederick Banting discovered how to extract insulin from the pancreas and used it to treat patients with diabetes. The first patient was a 14-year-old boy dying of the disease; after he received his injections, his blood glucose levels dropped to near-normal levels. The medical field soon moved from extracting insulin from dogs to extracting it from pigs and cattle. Nowadays, people manage their diabetes with synthetic insulin created in laboratories.
Gene therapy was actually first conceived in the 1960s, but only recently has it come into use in the medical field to help people. Back then, doctors proposed that introducing new DNA sequences could help patients with certain genetic disorders. In the 1980s, researchers published work showing how a virus could introduce genes into stem cells for beneficial purposes. Today, we're able to read a complete blueprint of our own DNA and learn which conditions we're more predisposed to.
Gene therapy saw its first success in a four-year-old patient born with severe combined immunodeficiency, a condition that caused her T cells to die off so that she couldn't fight infections. Through gene therapy, her body became able to fight these infections off, and she is still alive today. Gene therapy falls into two broad categories: ex vivo and in vivo. In ex vivo therapy, cells are removed from a patient, given new genetic material delivered by a vector, and then reintroduced into the body. In vivo therapy involves infusing the vector directly into the bloodstream or other targeted organs. Keep reading for more medical breakthroughs that changed the healthcare industry.
When experts first developed 3D printing, most people saw it as a creative outlet for making whatever they wanted; very few considered the benefits it could bring to the medical field. At first, the focus was on creating prosthetics for those who were missing limbs, but the technology has expanded far beyond that. Labs can now 3D print living tissue, minimizing the need for skin grafts or searches for donors. In a process called 3D bioprinting, researchers combine living cells with biopolymer gels to build tissue that the body can accept without much risk of rejection.
Organ printing is also in the works, which would reduce wait times for those in immediate need of a donated organ. There's also a reduced risk of rejection: when a person receives an organ transplant, they must take many medications that suppress their immune system so that their body doesn't reject the new organ. These drugs also put them at risk of contracting infections that can be quite deadly. Printed organs could reduce the need to take that risk.
At one time, if something happened to your kidneys, that was the end of it. The kidneys are responsible for filtering toxins out of your blood so that you can stay healthy, and when they start to fail, you can get very, very sick. Without dialysis, kidney failure was essentially a death sentence. Doctors first discussed the concept of dialysis in the 19th century, but it wasn't until 1913 that the idea was put into practice.
In those early experiments, researchers filtered the blood of anesthetized animals outside the body through cellulose membranes to remove certain substances. Machines called "dialyzers" were then built to help patients in the stages of kidney failure. Many of these patients were too far along in their illness to survive, and the machines were not completely effective at cleaning the blood. Since then, however, dialysis machines have been continually improved into the devices we have today. Now, patients with kidney disease or failure can receive treatment with a dialysis machine that helps them continue living full and healthy lives.
Cancer is one of the biggest killers because there are so many different types. Back in the day, once you were diagnosed with cancer, there wasn't much you could do. Then radiation and chemotherapy arrived, helping many people survive this horrible disease. But both chemo and radiation have awful side effects that can make a patient feel much worse before getting better. That's where cancer immunotherapy comes in, aiming to improve what a patient has to go through.
Chemo and radiation work by targeting almost all the cells in the body in order to kill the cancer cells and prevent them from spreading. Cancer immunotherapy, on the other hand, stimulates the body's natural immune system to increase its chances of fighting off the disease. For the time being, immunotherapy treatments are limited in scope because of how complex tumors are. However, researchers are constantly looking for ways to overcome these hurdles and improve the outcomes of these therapies so that they can be used more widely to help patients combat cancer.
5. Vaccines Are Part of Amazing Medical Breakthroughs
The history of modern vaccines began in the late 18th century with the discovery of immunization against smallpox. Doctor Edward Jenner took material from cowpox and inoculated a young boy; as a result, the boy became immune to smallpox. This established the principle of using a weaker version of a virus to help people develop immunity against a more potent one. Then in the 20th century, vaccines for polio were developed, changing the world's fight against this horrible infection, which could lead to spinal paralysis and muscle weakness.
Since then, researchers have made strides in developing other vaccines, including those for tetanus, the flu, diphtheria, measles, mumps, and rubella, just to name a few. These vaccines let children develop immunity to such diseases from a young age, giving them a chance to live well into adulthood. Clinics offer flu shots every year at the beginning of flu season; they decrease the risk of infection and help people fight off symptoms. Thanks to vaccines, it has become much more difficult for these viruses to spread, and people worldwide have been spared these deadly diseases.
Did you know that doctors didn't discover modern anesthesia until 1846? It's frightening to think of what patients went through without it, so be thankful that you live in a modern age where you can undergo surgical procedures and invasive treatments without experiencing tremendous pain. Of course, earlier eras weren't as barbaric as you might think. Pain-dulling remedies have been used since the time of the Babylonians and Greeks, and in the 1200s, doctors used sponges soaked with opium and mandrake root. But these only did so much to minimize the pain.
In 1846, a dentist in Boston first used sulfuric ether. Why? To anesthetize a man who needed a tumor removed from his neck. Before that operation, the dentist had exposed himself and his pets to the fumes, and once he was satisfied that the ether was safe, he began using it on his dental patients. Since then, doctors have developed and utilized other substances that allow patients to undergo procedures without the stress of pain. These have included nitrous oxide, chloroform, cocaine, sodium thiopental, and cyclopropane, just to name a few.
Surgeries used to be fully open: surgeons had to cut open parts of the body in order to correct a problem. That sometimes led to massive blood loss, a risk of infection, and a long recovery time because of how many stitches were needed to close the patient up. Today, many surgeries are minimally invasive: doctors insert tiny instruments through incisions often less than an inch wide and still achieve the same results. Specialized instruments carry small cameras on their ends so that surgeons can see exactly what they're doing without fully cutting the patient open.
The many benefits of minimally invasive surgery include less pain and less blood loss (so there's less need for transfusions), less damage to the surrounding tissue, a reduced risk of complications, and a faster recovery time. The less time a patient has to spend in the hospital, the sooner they can get back to their usual routines. And because these instruments are so precise, there is less overall risk.
Artificial intelligence was once a thing of science fiction, and movies about robotic overlords capable of taking over an entire planet made people a little afraid to adopt it. Over time, though, the medical field has recognized the importance of artificial intelligence and implemented it into practice. The most common applications include helping diagnose patients and improving communication between physician and patient. AI tools can transcribe medical documents like prescriptions, and they make it possible to treat patients remotely when a doctor is unavailable. Doctors also use artificial intelligence to classify skin cancers so that dermatologists can better treat their patients.
The implementation of artificial intelligence has made diagnosing and treating patients quicker and more effective. Patients no longer have to endure a lengthy process to reach a diagnosis. Instead, artificial intelligence can narrow down symptoms, streamline the time physicians spend on patients' charts, and assist with questions patients may have outside of office hours. There is hope that AI will advance the healthcare sector even further in the future, though there are still many challenges to overcome.
1. The Development of Antibiotics Is One of the Best Medical Breakthroughs
Most people take antibiotics for granted. Before researchers discovered them, bacterial infections were quite prevalent, and people easily succumbed to them. Diseases like rheumatic fever, gonorrhea, and pneumonia had no treatment, and patients would die from them. It wasn't until Alexander Fleming discovered penicillin in 1928 that people could finally hope to live healthier lives. Examining his Petri dishes of Staphylococcus colonies, Fleming found that one dish was free of bacteria around a spot where mold was growing.
Fleming later discovered that this mold, Penicillium notatum, killed a wide range of harmful bacteria. He had his assistants help him isolate the mold and extract the penicillin so that he could use it for therapeutic purposes. About a decade later, a team of Oxford University scientists turned it into a life-saving drug, growing their cultures in milk churns and bedpans and running trials on animals before moving on to humans. Since the discovery of penicillin, experts have created hundreds of antibiotics to treat and counteract other bacterial infections and diseases.