Pharmacophore an International Research Journal
Open Access | Published: 2021 - Issue 2

COMPUTER-ASSISTED SURGERY: VIRTUAL- AND AUGMENTED-REALITY DISPLAYS FOR NAVIGATION DURING PLANNING AND PERFORMING SURGERY ON LARGE JOINTS

Artem Evgenevich Mishvelov1*, Abdurakhman Khasbulaevich Ibragimov2, Ismail Tyurshievich Amaliev3, Akhmed Abdullaevich Esuev3, Oleg Valerievich Remizov4, Marina Anatolievna Dzyuba5, Alexander Nikolaevich Simonov6, Anastasiya Isaakovna Okolelova7, Sergey Nikolaevich Povetkin8

 

  1. Laboratory of 3D Technologies, Stavropol State Medical University, Stavropol, Russia.
  2. Faculty of Pediatrics, Dagestan State Medical University, Makhachkala, Russia.
  3. Faculty of General Medicine, Chechen State University, Grozny, Chechen Republic, Russia.
  4. UNESCO Department “Health education for sustainable development”, North Ossetian State Academy, Republic of North Ossetia-Alania, Russia.
  5. Rectorate of Essentuki Institute of Management Business and Law, Essentuki, Russia.
  6. Department of Epizootology and Microbiology, Faculty of Veterinary Medicine, Stavropol State Agrarian University, Stavropol, Russia.
  7. Department of Anatomy, Veterinary Obstetrics and Surgery, Faculty of Veterinary Medicine, Kuban State Agrarian University, Krasnodar, Russia.
  8. Department of Food Technology and Engineering, Institute of Life Systems, North Caucasus Federal University, Stavropol, Russia.

ABSTRACT

Today, only a small number of full-fledged multifunctional CT systems are used for planning intraoperative interventions with mixed reality, for example in cardiovascular surgery and urology, and especially in oncology. Surgeons need a wide field of view to operate. Currently, the use of technologies such as the da Vinci surgical robot is constrained by the bulkiness of software and hardware solutions, the lack of trained specialists, and the high cost of equipment. There is no technology for navigating the course of an operation in which a layer of mixed reality is applied to the patient during surgery so that the surgeon can track the position of an organ, organ systems, and surgical instruments in real-time. The developed software prototype makes it possible to create three-dimensional models of internal organs from computed tomography and magnetic resonance imaging data, with the possibility of simulating the surgical intervention. Augmented reality glasses are also used to practice operative skills on a virtual patient with phantom dummies, to link surgical instruments to the holography, and to fully combine the preoperative simulation with the real patient in real-time.

Keywords: Surgical intervention, Virtual reality, Augmented reality, Robotic-assisted surgery, UNESCO


Introduction

Augmented reality (AR) is a powerful tool in the medical field, where it offers the physician more patient information by placing relevant clinical data in the line of sight between physician and patient. This medical information can be obtained from imaging studies of the patient using computed tomography (CT), magnetic resonance (MR) imaging, and positron emission tomography (PET), and can be displayed overlaid on the physical world, enabling user interaction and manipulation [1-4].

Patients with complex pathology are prescribed examinations based on digital technologies for diagnosis. To date, these examination methods include computed tomography and magnetic resonance imaging [5-7].

Modern software for processing the images obtained during CT and MRI is supplied with these devices. The functions of these programs are sometimes insufficient for more complex work with the obtained data, including the differentiation of tumors in various diseases [8-10]. The creation of 3D models of organs in these programs is quite successful, but it requires somewhat more time and additional resources [11-13]. In addition, the supplied software is not able to create 3D models of the internal structure of organs if the images were obtained without the use of contrast agents [14-17]. There is also a very limited number of programs (Amira for Life Sciences, Germany; UNIM, Russia) for viewing, describing, and reconstructing DICOM images obtained from CT and MRI for diagnostics and treatment planning using neural networks. One such software package is Myrian (Intrasense, France), a multi-modal solution for viewing and post-processing medical images. Myrian includes easy-to-use but powerful specialized packages for colonoscopy, the liver, the lungs, blood vessels, orthopedics, and others. Alternative software (3DimViewer, RadiantDICOM, etc.) either lacks the necessary functionality or is too difficult to use in medical institutions: there is no single modular complex that can perform recognition and differentiation of tumors, view reconstructed 3D models on surgical monitors, offer an intuitive interface, and support augmented reality glasses for doctors of various specialties [18-20]. Andrén et al. (2006) showed that a three-dimensional reconstruction of the pancreas with semi-automatic segmentation can be created in a similar way to liver imaging [21]. There is also an analog of the HoloLens surgical procedure (Scopis, Germany) developed for spinal surgery [22-24]. During surgery, the holographic navigation platform projects elements of mixed reality onto the patient through HoloLens glasses. The new technology allows surgeons to correctly place fixation elements, in particular for transpedicular fixation of the spine. The essence of transpedicular fixation is that a titanium screw is inserted into the vertebra; this method is used in the treatment of complex injuries [25, 26]. The use of HoloLens is also expected to reduce the duration of operations and the radiation exposure from X-rays of the spine [23, 27, 28].

An analog for processing CT and MRI data is the Vitrea 2 workstation (a hardware and software complex). Unlike the Amira complex, the Vitrea 2 complex costs more, since it is delivered together with tomographs in the complete package. The Vitrea 2 workstation provides improved 2D, 3D, and 4D visualization and analysis of medical images during the radiologist's daily routine work. The Vitrea 2 software, which is fully focused on clinical needs, provides a fast, intuitive clinical tool. Vitrea 2 includes a graphical interface and built-in automation of the clinical process, giving the user the ability to prepare a patient report in minutes. The processing speed of the Vitrea 2 software allows clinicians to control the 3D volume interactively and to perform virtual viewing inside and outside anatomical areas of interest in real-time [29-31].

As noted above, free software (3DimViewer, RadiantDICOM, etc.) does not meet these requirements. However, we previously developed the HoloWiver module, which allows medical data to be viewed in HoloLens glasses [32-34].

Our review revealed relatively little research on planning and navigating the course of surgery in real-time using combined CT and ultrasound systems and on the use of avatar technologies in telemedicine with mixed reality [35-37]. Analysis of the literature allowed us to identify the main shortcomings in this direction:

  1. A lack of information about the use of AR technology and holographic augmented reality glasses to combine simulation and reconstruction on a real patient in real-time;
  2. A lack of programs on the Russian market that allow treatment to be planned and its results tracked on the basis of objective indicators, including with the use of augmented reality glasses.

The solution to these problems is associated with the creation of a new type of simulator and a system for planning and navigating surgical interventions. The purpose of this work was to test in practice the methodology we developed for planning and performing surgical interventions on the skeleton with assistive augmented reality technology.

Materials and Methods

The method developed by us relates to medical equipment and software, namely, to the means of preparing and performing a surgical operation using augmented reality glasses as an assistant to surgeons [32, 35, 38, 39]. The method includes the stages of planning the surgical intervention and of entering and obtaining the necessary data during the surgical intervention.

Our proposed method allows preoperative planning to be performed using original software and augmented reality glasses (the augmented reality complex). During the operation, the augmented reality complex assists by providing access to data from additional examination methods and by "overlaying" normal anatomy or previously obtained MRI or CT data onto the operating field in the augmented reality glasses.

The method includes the stages of planning, surgery, and surgical navigation in real-time using augmented reality. At the planning stage, the patient undergoes computed tomography or magnetic resonance imaging, and from the resulting DICOM files 3-4 separate three-dimensional models (the bone itself, its blood vessels, ligaments, pins, etc.) are built on the computer in the program. The resulting multi-layer model is placed in the simulator program, with the addition of models of the body surface and large vessels. The method developed by us allows the course of surgery on the human skeleton and the organs of the lumbar region to be planned. The surgeon works with 3D models of bones in the form of holography, with the ability to view the anatomical structure of the bone as a 3D model obtained after CT. Thus, the surgeon plans the course of the operation and creates a holographic combination of the simulations and reconstructions of the clinical 3D models with the real patient in real-time, with the ability to view the medical history and DICOM images.
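To illustrate this planning step, the following minimal sketch shows how a CT DICOM series can be turned into one layer of such a multi-layer model, a bone surface mesh. This is not the authors' HoloDoctor/HoloWiver code; the use of the pydicom and scikit-image libraries, the folder and file names, and the 300 HU bone threshold are illustrative assumptions.

    # Sketch: CT DICOM series -> Hounsfield volume -> bone surface mesh (OBJ).
    from pathlib import Path

    import numpy as np
    import pydicom
    from skimage import measure


    def load_ct_volume(series_dir: str):
        """Read one CT series and return a (z, y, x) volume in Hounsfield units plus voxel spacing."""
        slices = [pydicom.dcmread(p) for p in Path(series_dir).glob("*.dcm")]
        slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))  # order slices along z
        volume = np.stack([s.pixel_array.astype(np.int16) for s in slices])
        # Rescale raw pixel values to Hounsfield units.
        volume = volume * float(slices[0].RescaleSlope) + float(slices[0].RescaleIntercept)
        dz = abs(float(slices[1].ImagePositionPatient[2]) - float(slices[0].ImagePositionPatient[2]))
        dy, dx = map(float, slices[0].PixelSpacing)
        return volume, (dz, dy, dx)


    def extract_surface(volume, spacing, hu_threshold=300.0):
        """Marching cubes on an HU threshold (about 300 HU roughly isolates bone)."""
        verts, faces, _, _ = measure.marching_cubes(volume, level=hu_threshold, spacing=spacing)
        return verts, faces


    def save_obj(verts, faces, path):
        """Write a Wavefront OBJ file that a simulator or viewer can import."""
        with open(path, "w") as f:
            for v in verts:
                f.write(f"v {v[0]:.3f} {v[1]:.3f} {v[2]:.3f}\n")
            for face in faces:
                f.write(f"f {face[0] + 1} {face[1] + 1} {face[2] + 1}\n")  # OBJ indices start at 1


    if __name__ == "__main__":
        vol, spacing = load_ct_volume("ct_series/")          # hypothetical input folder
        verts, faces = extract_surface(vol, spacing, 300.0)  # the bone layer of the multi-layer model
        save_obj(verts, faces, "tibia_bone.obj")

The same thresholding-and-meshing step can be repeated with other thresholds or segmentation masks to obtain the remaining layers (vessels, ligaments, pins) before loading them into the simulator.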

3D images in the system are created from MRI and CT scans of the internal organs, and after DICOM reconstruction the program itself assigns a specific color to each of the organs. The resulting image is transmitted to the HoloLens augmented reality glasses. Preoperative planning begins with a CT scan of the patient, after which the exact structure of the organ is available for submission to the CT planning system. The CT image taken earlier (about 1 hour before) is combined with the CT image of the patient obtained in real-time. This module will allow radiologists to reduce the time needed to process and describe medical images.
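One possible way to implement the fusion of the earlier CT with the real-time CT is intensity-based rigid registration; the sketch below uses SimpleITK for that purpose. This is an assumption about one workable approach, not the authors' module, and the folder and output file names are placeholders.

    # Sketch: rigidly register the planning CT onto the real-time CT and resample it.
    import SimpleITK as sitk


    def read_series(series_dir):
        """Load a DICOM series into a SimpleITK 3D image."""
        reader = sitk.ImageSeriesReader()
        reader.SetFileNames(reader.GetGDCMSeriesFileNames(series_dir))
        return reader.Execute()


    def fuse_ct(fixed, moving):
        """Register the earlier CT (moving) onto the real-time CT (fixed) with a rigid transform."""
        fixed_f = sitk.Cast(fixed, sitk.sitkFloat32)
        moving_f = sitk.Cast(moving, sitk.sitkFloat32)
        initial = sitk.CenteredTransformInitializer(
            fixed_f, moving_f, sitk.Euler3DTransform(),
            sitk.CenteredTransformInitializerFilter.GEOMETRY)
        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        reg.SetOptimizerAsRegularStepGradientDescent(
            learningRate=2.0, minStep=1e-4, numberOfIterations=200)
        reg.SetInterpolator(sitk.sitkLinear)
        reg.SetInitialTransform(initial, inPlace=False)
        transform = reg.Execute(fixed_f, moving_f)
        # Resample the earlier CT into the real-time CT's coordinate frame (-1000 HU = air).
        return sitk.Resample(moving, fixed, transform, sitk.sitkLinear, -1000.0,
                             moving.GetPixelID())


    if __name__ == "__main__":
        realtime_ct = read_series("ct_realtime/")   # hypothetical folders
        planning_ct = read_series("ct_planning/")
        fused = fuse_ct(realtime_ct, planning_ct)
        sitk.WriteImage(fused, "planning_ct_in_realtime_frame.nii.gz")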

The system uses Microsoft HoloLens to apply a layer of mixed reality to the patient during surgery. The surgeon can use it to track the position of an organ, organ systems, and surgical instruments in real-time. The program simulates manipulations on the created 3D image of a real patient using the surgical instruments of the corresponding specialty. The system is fully adapted to the use of augmented reality glasses (HoloLens mixed reality glasses) for practicing, combining, and simulating various manipulations in surgery, which makes it possible to project virtual organs onto the patient's body.

Using gestures, the medical specialist can point to the desired organ and also remove it from the illustration. The image is then connected to the mixed reality glasses, and the doctor can see a virtual 3D map of the person's internal organs directly on their body. The system interacts with the surgical dummy (if necessary, special markers are pasted on for the operation of the simulator program using mixed reality holography), using previously acquired CT and MRI data.
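As an illustration of how pasted markers can tie the previously acquired CT/MRI model to what the glasses see, the sketch below computes the rotation and translation that map marker positions measured in the CT frame onto the same markers measured by the headset. It is a generic point-based rigid registration (Kabsch/SVD), not the authors' implementation, and the marker coordinates are made up.

    # Sketch: fit a rigid transform from CT-frame markers to headset-frame markers.
    import numpy as np


    def rigid_registration(src, dst):
        """Least-squares rigid transform (R, t) such that dst ~ R @ src + t (Kabsch/SVD)."""
        src, dst = np.asarray(src, float), np.asarray(dst, float)
        c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
        H = (src - c_src).T @ (dst - c_dst)      # 3x3 cross-covariance of centered points
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection solution
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = c_dst - R @ c_src
        return R, t


    # Example: three fiducial markers, coordinates in mm (illustrative values only).
    # Here the headset markers are simply the CT markers shifted by (100, 50, 30),
    # so the fit returns R = identity and t = (100, 50, 30).
    markers_ct = [[10.0, 0.0, 0.0], [0.0, 25.0, 0.0], [0.0, 0.0, 40.0]]
    markers_headset = [[110.0, 50.0, 30.0], [100.0, 75.0, 30.0], [100.0, 50.0, 70.0]]
    R, t = rigid_registration(markers_ct, markers_headset)

    # Any vertex of the CT-based 3D model can now be placed in the headset frame:
    vertex_ct = np.array([5.0, 5.0, 5.0])
    vertex_headset = R @ vertex_ct + t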

The developed method was tested at the Stavropol Regional Clinical Hospital (1 Semashko Street, Stavropol, Russia) in traumatology (osteology) using augmented reality in 2020 (Figure 1).

 

Figure 1. Performing a surgical intervention on the kneecap with augmented reality assistance at the Stavropol Regional Clinical Hospital.

Results and Discussion

Clinical Case 1

In 2020, the Stavropol Regional Clinical Hospital performed a surgical operation using augmented reality glasses for the first time. Modern technologies helped to speed up the autoplasty (a surgical method). The anatomical atlas and the system of interaction with HoloLens augmented reality glasses were tested during several real operations on the knee joint performed in the Traumatology and Orthopedics Department of the Stavropol Regional Clinical Hospital.

We selected the first patient based on the degree of complexity of the operation: the patient had been injured during a football match, with damaged ligaments of the knee joint (the cruciate ligaments) and a small crack in the tibia (Figure 2). HoloLens augmented reality glasses were used for navigation of the surgical intervention, which reduced the operation time by 28 minutes: instead of 2 hours, the operation took 1 hour and 32 minutes. This technology makes it possible to add computer-generated visual objects (in this case, a 3D model of the tibia, knee joint, pins, and blood vessels) to the existing objective reality.

The software allowed us to display everything the surgeon needs for preoperative planning. Directly during the operation, the surgeon received the results of all the studies, images, and MRI and CT data, which supplemented the operation process.

 

Figure 2. Demonstration of the 3D reconstruction of the fibula, tibia, and patella (panels a-c) with DICOM images of the knee joint in augmented reality glasses.

 

The new HoloDoctor.Orto software expands the boundaries of training: it shows the surgeon's work from a first-person view and allows a hint to be received from a more experienced colleague in real-time right in the operating room. Before HoloDoctor.Orto was implemented, HoloLens augmented reality glasses were used to describe medical images, practice procedures, and combine 3D models.

 

Figure 3. Training scheme for the surgery.

 

The 3D models of the leg obtained from the 3D scanner are combined with the anatomical model.

 

Clinical Case 2

Computed tomography of the lumbosacral spine.

On a series of tomograms and multiplanar reconstructions of the L1-S1 segments of the spine, posterior fusion with metal plates and transpedicular screw fixation in the Th11-L2 vertebral bodies is determined. The position of the metal structure is correct. The anamnesis includes surgical treatment: removal of the Th8, Th11, and Th12 hemivertebrae. The left transverse process of the L5 vertebra is expanded, forming an articulation with the adjacent parts of the sacrum. The shape and proportions of the vertebrae are not changed. The height of the disks is reduced. The tissues of the L2-L3, L3-L4, and L4-L5 disks protrude circularly beyond the bony borders of the vertebral bodies by 3.4, 3.5, and 5.5 mm, respectively, moderately compressing the dural sac. The closing plates of the vertebral bodies are sealed, with smooth contours. Marginal bone growths are determined along the edges of the adjacent closing plates of the vertebral bodies (Figure 3). The facets of the articular processes are sclerosed, with marginal bone growths. The lumen of the spinal canal is not changed. The dural sac and epidural tissue are differentiated.

Result of the study: a history of surgical treatment, posterior spondylodesis with transpedicular fixation. CT shows signs of dystrophic changes in the lumbar spine, circular protrusion of the L2-L3, L3-L4, and L4-L5 disks, and left-sided sacralization of the L5 vertebra.

A neurologist, together with a radiologist from the Department of Radiology and the Department of Neurology, could examine the pathological signs (sealed closing plates of the vertebral bodies with marginal bone growths) using HoloLens augmented reality glasses and a PC. The doctors worked with medical data in the form of DICOM images, medical histories, and a 3D model of the spine with the straightening plate. The radiologist described the clinical case in the glasses using DICOM images, measured the size of the marginal bone growths, the vertebral bodies, and the damaged intervertebral disks, and uploaded the images with the history to the PACS server. All the described studies were then transferred to the neurologist and surgeon for further treatment. The radiologist and neurologist were satisfied with the result of using HoloLens glasses with the developed HoloDoctor program. The same clinical case was also described and worked through on a PC using our program. The time taken by the radiologist to describe the clinical case was less than 15 minutes in HoloLens glasses and 20 minutes on the computer.

The results of the work showed that, using the software package, the technological cycle of planning and navigating a surgical operation can be worked out and optimized according to the following algorithm:

  1. Create a pipeline for digital processing of DICOM images that reconstructs organs from CT and MRI data; the resulting 3D models of organ systems or of a separate organ are then uploaded to the PACS server (see the sketch after this list) or to the surgical intervention simulator for further use.
  2. Upload the patient's medical history, CT and MRI data, and 3D reconstruction from the PACS server to the graphics workstation or to the program in the HoloLens glasses.
  3. Save the finished scenario as a 3D model or video recording to the PACS server.
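A minimal sketch of the PACS upload mentioned in steps 1 and 3 is given below, assuming the server exposes a standard DICOM C-STORE interface and using the pydicom and pynetdicom libraries; the host, port, AE titles, and folder name are placeholders, not details from the article.

    # Sketch: send the processed DICOM images of a study to the PACS server via C-STORE.
    from pathlib import Path

    from pydicom import dcmread
    from pynetdicom import AE
    from pynetdicom.sop_class import CTImageStorage

    # Placeholder connection details for the PACS node.
    PACS_HOST, PACS_PORT, PACS_AET = "pacs.example.local", 11112, "PACS_SCP"


    def upload_series(series_dir: str) -> None:
        """Send every DICOM file in a folder to the PACS server."""
        ae = AE(ae_title="HOLO_SCU")
        ae.add_requested_context(CTImageStorage)
        assoc = ae.associate(PACS_HOST, PACS_PORT, ae_title=PACS_AET)
        if not assoc.is_established:
            raise ConnectionError("Could not associate with the PACS server")
        try:
            for path in sorted(Path(series_dir).glob("*.dcm")):
                status = assoc.send_c_store(dcmread(path))
                if not status or status.Status != 0x0000:
                    print(f"C-STORE failed for {path.name}")
        finally:
            assoc.release()


    if __name__ == "__main__":
        upload_series("processed_series/")   # hypothetical output folder from step 1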

Conclusion

When studying the possibilities of augmented reality in traumatology (osteology), all the tasks set were solved. Clinical cases were processed and integrated with HoloLens augmented reality glasses to describe medical images, practice procedures, and combine 3D models.

The original software, together with augmented reality glasses (the augmented reality complex), allows models of various surgical pathologies to be created from DICOM files obtained during CT or MRI studies and stored in databases. This makes it possible to model an almost unlimited variety of clinical cases and to provide multi-faceted training for doctors of surgical specialties.

Based on the obtained DICOM CT or MRI files, the original software allows a specific clinical case to be simulated. The teaching method consists of the use of the augmented reality complex by the teacher and the students. The students, placed in a simulated clinical situation, perform surgical manipulations, while the teacher monitors the handling of the given case. This technique allows surgical interventions to be practiced both individually and in groups of different levels of training without involving expensive models.

It is also possible to use the augmented reality system as a "surgeon's eye". This technique allows students to follow the smallest nuances of the surgery and receive comments from the leading surgeon without being present in the operating room, which reduces the bacterial load on the operating room and eliminates the possibility of a biological accident. The technique also allows real-time broadcasting and recording of the most interesting clinical cases.

The augmented reality complex can also serve as a telemedicine tool during patient consultations or surgery.

The educational process is supported by the use of a holographic training simulator together with phantoms or mannequins. In response to the trainees' actions, not only the patient's physiological parameters change automatically, but also the intraoperative endosurgical, angiographic, and ultrasound images.

Acknowledgments: The authors express their gratitude to Dr. Igor Spartakovich Baklanov for supervising the project and reviewing and editing the article.

Conflict of interest: None

Financial support: None

Ethics statement: None

References

1.        Siddiqui SA, Ahmad A. Dynamic analysis of an observation tower subjected to wind loads using ANSYS. In: Proceedings of the 2nd International Conference on Computation, Automation and Knowledge Management (ICCAKM) [conference proceedings on the Internet]; 2021:19-21; Dubai, United Arab Emirates. United Arab Emirates: IEEE; 2021 [cited 2021 Jan 19]. p. 6-11. Available from: IEEE Explore

2.        Siddiqui SA, Ahmad A. Implementation of Thin-Walled Approximation to Evaluate Properties of Complex Steel Sections Using C++. SN Comput Sci. 2020;1(342):1-11. Available from: https://link.springer.com/article/10.1007/s42979-020-00354-1. doi:10.1007/s42979-020-00354-1

3.        Moreta-Martinez R, García-Mato D, García-Sevilla M, Pérez-Mañanes R, Calvo-Haro J, Pascau J. Augmented reality in computer-assisted interventions based on patient-specific 3D printed reference. Healthc Technol Lett. 2018;5(5):162-6.

4.        Blinov AV, Siddiqui SA, Nagdalian AA, Blinova AA, Gvozdenko AA, Raffa VV, et al. Investigation of the influence of Zinc-containing compounds on the components of the colloidal phase of milk. Arab J Chem. 2021;14(7):103229.

5.        Bledzhyants GA, Mishvelov AE, Nuzhnaya KV, Anfinogenova OI, Isakova JA, Melkonyan RS, et al. The effectiveness of the medical decision-making support system "electronic clinical pharmacologist" in the management of patients therapeutic profile. Pharmacophore. 2019;10(2):76-81.

6.        Demchenkov EL, Nagdalian AA, Budkevich RO, Oboturova NP, Okolelova AI. Usage of atomic force microscopy for detection of the damaging effect of CdCl2 on red blood cells membrane. Ecotoxicol Environ Saf. 2021;208:111683.

7.        Barabanov PV, Gerasimov AV, Blinov AV, Kravtsov AA, Kravtsov VA. Influence of nanosilver on the efficiency of Pisum sativum crops germination. Ecotoxicol Environ Saf. 2018;147:715-9. doi:10.1016/j.ecoenv.2017.09.024

8.        Osipchuk GV, Povetkin SN, Ashotovich A, Nagdalian IA, Rodin MI, Vladimirovna I, et al. The Issue of Therapy Postpartum Endometritis in Sows Using Environmentally Friendly Remedies. Pharmacophore. 2019;10(2):82-4.

9.        Kenijz NV, Koshchaev AG, Nesterenko AA, Omarov RS, Shlykov SN. Study the effect of cryoprotectants on the activity of yeast cells and the moisture state in dough. Res J Pharm Biol Chem Sci. 2018;9(6):1789-96.

10.     Morozov VYu, Kolesnikov RO, Chernikov AN. Effect from Aerosol Readjustment Air Environment on Productivity and Biochemical Blood Parameters of Young Sheep. Res J Pharm Biol Chem Sci. 2017;8(6):509-14.

11.     Blinov AV, Yasnaya MA, Blinova AA, Shevchenko IM, Momot EV, Gvozdenko AA, et al. Computer quantum-chemical simulation of polymeric stabilization of silver nanoparticles. Phys Chem Aspects Study Clusters Nanostruct Nanomater. 2019;11:414-21.

12.     Nagdalian AA, Rzhepakovsky IV, Siddiqui SA, Piskov SI, Oboturova NP, Timchenko LD, et al. Analysis of the content of mechanically separated poultry meat in sausage using computing microtomography. J Food Compost Anal. 2021;100:103918. doi:10.1016/j.jfca.2021.103918

13.     Nesterenko A, Koshchaev A, Kenijz N, Akopyan K, Rebezov M, Okuskhanova E. Biomodification of meat for improving functional-technological properties of minced meat. Res J Pharm Biol Chem Sci. 2018;9(6):95-105.

14.     Azuma RT. A survey of augmented reality. Presence (Camb). 1997;6(4):355-85.

15.     Burbano A. 3D Cameras Benchmark for Human Tracking in Hybrid Distributed Smart Camera Networks. Proceedings of the 10th International Conference on Distributed Smart Camera (ICDSC '16). 2016:76-83.

16.     Blinov AV, Kravtsov AA, Krandievskii SO, Timchenko V, Gvozdenko AA, Blinova A. Synthesis of MnO2 nanoparticles stabilized by Methionine. Russ J Gen Chem. 2020;90(2):283-6.

17.     Salins SS, Siddiqui SA, Reddy SVK, Kumar S. Parametric Analysis for Varying Packing Materials and Water Temperatures in a Humidifier. In: Proceedings of the 7th International Conference on Fluid Flow, Heat and Mass Transfer (FFHMT’20) [Conference proceedings on the Internet]; 2020 Nov 15-17; Niagara Falls, Canada. Canada: FFHMT; 2020:196(1)-196(11). Available from: FFHMT

18.     Bernhardt S. The status of augmented reality in laparoscopic surgery as of 2016. Med Image Anal. 2017;37:66-90.

19.     Nagdalian AA, Pushkin SV, Povetkin SN. Migalomorphic spiders venom: extraction and investigation of biological activity. Entomol Appl Sci Lett. 2018;5(3):60-70.

20.     Luneva A, Koshchayev A, Nesterenko A, Volobueva E, Boyko A. Probiotic potential of microorganisms obtained from the intestines of wild birds. Int Trans J Eng Manag Appl Sci Technol. 2020;11(12):11A12E.

21.     Andrén O, Fall K, Franzén L, Andersson SO, Johansson JE, Rubin MA. How well does the Gleason score predict prostate cancer death? A 20-year follow-up of a population-based cohort in Sweden. J Urol. 2006;175(4):1337-40. doi:10.1016/S0022-5347(05)00734-2

22.     Qian L, Wu JY, DiMaio SP, Navab N, Kazanzides P. A review of augmented reality in robotic-assisted surgery. IEEE Trans Med Robot Bionics. 2019;2(1):1-6. doi:10.1109/TMRB.2019.2957061

23.     Hite GJ, Mishvelov AE, Melchenko EA, Vlasov АА, Anfinogenova OI, Nuzhnaya CV, et al. Holodoctor planning software real-time surgical intervention. Pharmacophore. 2019;10(3):57-60.

24.     Lopteva MS, Povetkin SN, Pushkin SV, Nagdalian AA. 5% Suspension of Albendazole Echinacea Magenta (Echinacea Purpurea) Toxicometric Evaluation. Entomol Appl Sci Lett. 2018;5(4):30-4.

25.     Pessaux P, Diana M, Soler L, Piardi T, Mutter D, Marescaux J. Towards cybernetic surgery: robotic and augmented reality-assisted liver segmentectomy. Langenbecks Arch Surg. 2015;400(3):381-5. doi:10.1007/s00423-014-1256-9

26.     Nesterenko AA, Koshchaev AG, Kenijz NV, Shhalahov DS, Vilts KR. Development of device for electromagnetic treatment of raw meat and starter cultures. Res J Pharm Biol Chem Sci. 2017;8(1):1080-5.

27.     Iyer RK, Chiu LL, Vunjak-Novakovic G, Radisic M. Biofabrication enables efficient interrogation and optimization of sequential culture of endothelial cells, fibroblasts and cardiomyocytes for formation of vascular cords in cardiac tissue engineering. Biofabrication. 2012;4(3):035002.

28.     Lunin LS, Lunina ML, Kravtsov AA, Blinov AV. Effect of the Ag Nanoparticle Concentration in TiO2–Ag Functional Coatings on the Characteristics of GaInP/GaAs/Ge Photoconverters. Semiconductors. 2018;52(8):993-6. doi:10.1134/S1063782618080122

29.     Nguyen NQ, Ramjist JM, Jivraj J, Jakubovic R, Deorajh R, Yang VX. Preliminary development of augmented reality systems for spinal surgery. In: Clinical and Translational Neurophotonics. Int Soc Opt Photonics. 2017;10050:100500K.

30.     Nuzhnaya KV, Mishvelov AE, Osadchiy SS, Tsoma MV, Slanova RH, Kurbanova AM, et al. Computer Simulation and Navigation in Surgical Operations. Pharmacophore. 2019;10(4):46-52.

31.     Oboturova NP, Nagdalian AA, Povetkin SN. Adaptogens instead restricted drugs research for an alternative Items to doping in sport. Res J Pharm Biol Chem Sci. 2018;9(2):1111-6.

32.     van Oosterom MN, van der Poel HG, Navab N, van de Velde CJ, van Leeuwen FW. Computer-assisted surgery: virtual-and augmented-reality displays for navigation during urological interventions. Curr Opin Urol. 2018;28(2):205-13. doi:10.1097/MOU.0000000000000478

33.     Saleeva IP, Morozov VYu, Kolesnikov RO. Disinfectants effect on microbial cell. Res J Pharm Biol Chem Sci. 2018;9(4):676-81.

34.     Selimov MA, Nagdalian AA, Povetkin SN, Statsenko EN, Kulumbekova IR, Kulumbekov GR, et al. Investigation of CdCl2 Influence on red blood cell morphology. Int J Pharm Phytopharmacol Res. 2019;9(5):8-13.

35.     Vávra P, Roman J, Zonča P, Ihnát P, Němec M, Kumar J, et al. Recent development of augmented reality in surgery: a review. J Healthc Eng. 2017;2017:1-9. doi:10.1155/2017/4574172

36.     Salins SS, Siddiqui SA, Reddy SVK, Kumar S. Experimental investigation on the performance parameters of a helical coil dehumidifier test rig. Energ Source Part A. 2021;43(1):35-53. Available from: https://www.tandfonline.com/doi/full/10.1080/15567036.2020.1814455

37.     Sizonenko MN, Timchenko LD, Rzhepakovskiy IV, DA SP AV, Nagdalian AA, Simonov AN, et al. The New Efficiency of the «Srmp»–Listerias Growth-Promoting Factor during Factory Cultivation. Pharmacophore. 2019;10(2):85-8.

38.     Alameri M, Sulaiman SA, Ashour A, Al-Saati MA. Knowledge and Attitudes of Venous Thromboembolism for Surgeons in Two Saudi Arabian Medical Centers. Arch Pharm Pract. 2019;10(3):107-11.

39.     Emeje IP, Onyenekwe CC, Ukibe NR, Ahaneku JE. Prospective Evaluation of p24 Antigen and HIV-1 Protease Assays at 6 Months and 12 Months Initiation of Antiretroviral Therapy in HIV Infected Participants at Federal Medical Center, Lokoja, Nigeria. Arch Pharm Pract. 2019;10(2):72-80.
