ISSN: 1745-7580
Pranav Vadlamudi, Paige Gordon
Immunoglobulin products made from human plasma were first used during World War II to combat infectious diseases, and in 1952 they were first used to treat primary immunodeficiencies. Over the last 50 years, further research has sought to determine whether immunoglobulin therapy is truly effective against primary immunodeficiencies such as X-linked agammaglobulinemia and Common Variable Immunodeficiency Disorder when delivered through intravenous or subcutaneous administration. Intravenous administration has been effective in increasing overall serum Ig concentration in patients with primary or secondary disorders. However, as more research was conducted, scientists concluded that the overall cost, maintenance demands, and, at times, limited efficacy make intravenous administration a burden. Scientists have therefore looked for an alternative in subcutaneous administration. For patients with primary immunodeficiencies, subcutaneous administration has proven effective in increasing immunoglobulin concentrations, in some cases more so than intravenous administration. The ability to administer subcutaneous therapy at home, its lower cost, and its heightened efficacy make it far preferable to intravenous administration for primary immunodeficiency patients. For secondary immunodeficiency patients, however, the efficacy of subcutaneous administration has not been fully established, and the available research is scarce and unreliable. Our literature review explores the advent of immunoglobulin therapy and the past research on both intravenous and subcutaneous administration for primary and secondary immunodeficiency disorders. We sought to identify experimental approaches researchers could pursue to strengthen the evidence on subcutaneous administration for secondary immunodeficiency patients.