【Dialogue】 Exploring Data Science for the Future of Digital Healthcare

Interviewee: Dr. Tsuguchika Kaminuma
Interviewer: Mr. Hisashi Iwase

Introduction

In recent years, and particularly in the past few, ICT has advanced rapidly and appears to be changing the overall structure of society. Many people may now be beginning to realize that these changes are also reaching health and medical services. The term “AI” has become ubiquitous, and it seems that everything is being labeled as AI; in reality, however, there are still very few people who truly understand AI and are designing its applications and plans accurately. The differences between Japan and other countries also appear to be significant.

Drugs, administered as part of medical practice after a doctor has diagnosed a disease, are one such area: recent scientific advances may significantly change the very concept of a “drug.” Furthermore, we seem to have entered a new era in which the boundaries of medicine are difficult to define, for example how far regenerative medicine and cell therapy fall within its scope. In this new era, nurturing talent in computational and data science, a new world that advances science by making use of data, seems crucial. For Japan, the question is how to train such data scientists, create an environment that enables them to take part in social transformations and various businesses, and help them thrive globally.

For this discussion, we spoke with Dr. Kaminuma, who has worked for many years in fields such as medical care, the life sciences, drug discovery, and environmental chemical substances from the perspectives of computational science and informatics.

The Dawn of Artificial Intelligence and Pattern Recognition

Mr. Iwase
Dr. Kaminuma, you have observed the development of informatics and computational applications both domestically and internationally over many years. From Japan’s period of rapid economic growth, through the bubble economy and its collapse, into the new century, you have had valuable experiences. Based on your extensive involvement in the early stages of informatics and the broad application of computers in biomedicine and drug development, how do you perceive the current state of Japan?
Dr. Kaminuma
I graduated from university in 1964, the year of the previous Tokyo Olympics. That summer, I entered a graduate program in the United States, and in the summer of 1971, I returned to Japan and joined a research institute at a Japanese company. While in the U.S., I worked under Professor Satoshi Watanabe at the University of Hawaii, who had transitioned from theoretical physics to pattern recognition. I assisted with his work and received a salary for it1). Although I earned my degree in theoretical physics, much of my work focused on pattern recognition. One of my labmates in the late 1960s was C. A. Kulikowski, who obtained his degree by applying pattern recognition to thyroid disorders and later joined the Artificial Intelligence Department at Rutgers University. My work involved a variety of topics, including machine learning techniques, applied research, and the development of imaging devices using laser light.

Upon returning to Japan in the summer of 1971, I joined Hitachi’s newly established information systems research institute. There, I became involved in a project to enable computers to perform precise differential diagnoses of heart diseases, collaborating with Dr. Kiyoshi Machii (then at Mitsui Memorial Hospital, later a professor at Toho University Medical School) and young researchers at the institute. This effort involved transferring the experiential knowledge of human specialists to computers to automate the process of enhancing diagnostic accuracy—what we now call an expert system. This work took place in the early 1970s. Additionally, we explored the use of R. Bellman’s dynamic programming to optimize drug administration based on data from gout patients accumulated at Toranomon Hospital. Incidentally, the term “expert system” was coined by Kulikowski. After moving to Rutgers University, he developed a system using expert knowledge for glaucoma diagnosis, which later became part of Japan’s so-called Fifth Generation Computer initiative.

In 1976, I joined the newly established Tokyo Metropolitan Institute of Medical Science. There, I aimed to create an environment capable of handling all types of medical data, including documents, numerical data, waveforms, images, 3D images, and maps—a platform-like system in today’s terms2). By the late 1970s, I had established a Medical Informatics group focused on developing statistical packages, though we did not engage in clinical diagnostic research. However, I became acquainted with researchers like E. H. Shortliffe, who developed MYCIN, an interactive system supporting antibiotic usage at Stanford, and P. Szolovits, who conducted clinical application research at MIT’s Artificial Intelligence Laboratory. These young researchers, in their early 30s at the time, formed the foundation of medical AI groups in the U.S., which eventually led to the creation of AMIA4). This movement resonated strongly with Dr. Shigeyoshi Kaihara, then an assistant professor at the University of Tokyo Hospital’s Information Processing Department. Dr. Kaihara, with some support from me, leveraged a series of lectures by these young U.S. researchers to establish the Japan Association for Medical Informatics in 1983. However, our research group has since had limited involvement in clinical diagnostic research.

The Dynamic and Ambitious Japan of the 1980s

Mr. Iwase
Dr. Kaminuma, I believe you founded the CBI Society in the 1980s. It feels like many new systems driving significant transformations were conceptualized during that era. Were there any keywords from the 1980s that anticipated the current state of affairs?
Dr. Kaminuma
In the 1980s, while leading a research group in Bioinformatics, I also contributed to establishing a Clinical Epidemiology group. In 1981, we launched the precursor to today’s CBI (Chem-Bio Informatics) Society, based on work that transferred the core elements of the NIH and EPA’s Chemical Information System to our institution. From 1983, we introduced experimental equipment such as clean benches and optical microscopes into our computational facilities and began conducting experiments, including observing embryogenesis using *C. elegans* and performing compound screenings. This involvement connected me to various projects and trends of the 1980s, such as the Ministry of International Trade and Industry’s (now METI) Fifth-Generation Computer Project, molecular and bio devices (and subsequently bio-computers), and neural network initiatives, some of which I supported.

The Fifth-Generation Computer Project is well known for its goal of creating an implementation environment for expert systems. However, I advocated a “semi-empirical methodology,” suggesting that “directly inputting the knowledge of specialists into a computer cannot yield reliable results. Such knowledge systems should serve as a starting point for improvement through learning from real-world data in practical settings.”

Today’s CBI Society is highly focused on drug discovery, but its initial goal was to integrate systems for compound databases and computational chemistry. Toward the late 1980s, I moved to the National Institute of Health Sciences, where I worked on establishing the newly accessible internet infrastructure, launching projects like the “Global Information Network on Chemicals (GINC)” in collaboration with international organizations such as WHO and UNEP, and addressing emerging issues like microplastic debris in oceans.

Since my retirement in 2001, I have stepped away from active research. Reflecting on that era, the period when I was engaged in research marked a time when Japan’s science, technology, and corporate activities (economy) were truly on the rise. Some researchers at institute meetings even declared that “there’s nothing to learn from America.”

Mr. Iwase
The 1990s marked a turning point when Japan’s momentum began to wane, followed by the collapse of the economic bubble and the explosion of the internet.
Dr. Kaminuma
Indeed, Japan was vibrant in the 1980s. However, some argue that the U.S. launched efforts to curb Japan’s rise, starting with the 1985 Plaza Accord. This is often seen as the beginning of Japan’s international stagnation. Domestically, the consumption tax increase in the late 1990s is viewed as a self-inflicted setback. Yet, rather than saying Japan declined during the Heisei era, it might be more accurate to say that other countries advanced. Singapore and China, in particular, made remarkable progress, largely driven by their response to the opening of the internet in 1994.

In 1992, as a WHO consultant, I spent a week in Singapore visiting agencies involved in chemical management, and I was struck by the experience. Singapore was ambitiously aiming to establish an electronic government by 2000. Around the same time, at a WHO conference in Malaysia, I was surprised to see participants from various countries, including China, Hong Kong, Canada, and Fiji, all of whom were ethnically Chinese.

Inspired by the introduction of the internet and WWW environments to my research institute and the CBI Society, I felt compelled to write a book to highlight the opportunities and warn of the challenges of this era. This resulted in my book *The Third Opening of Japan: The Shock of the Internet* (Tsuguchika Kaminuma, Kinokuniya, 1994). Unfortunately, it did not receive the attention I had hoped for. I believe the theme of “The Third Opening of Japan” diluted the urgency of the message that Japan needed to address the internet wave immediately. The goal should have been not an “opening” but a “reformation.”

Subsequently, the nations and companies that embraced the internet wave experienced significant growth. In this sense, the 1990s was a major turning point, and the defining keyword or zeitgeist of the era was undoubtedly the response to the (opened) internet.

Understanding Data Science

Mr. Iwase
The term “data science” is quite broad and seems difficult for the general public to understand. I believe this is because it is rarely made clear what kinds of data are applied to what kinds of applications. Could you explain this in a way that is easy for a general audience to understand?
Dr. Kaminuma
It’s a field where generalizations are challenging, but in areas I’ve been somewhat involved with—such as natural sciences, health and medical care, drug discovery, and environmental science research and development—there are frameworks that can be made more accessible.

For instance, in the context of medical diagnosis, doctors constantly make decisions. The foundational knowledge, information, and materials they rely on are diverse, including language (descriptions, natural language), waveforms, images (photographs), and observations of movements. Today, many of these elements can be handled digitally, though not completely. At a basic level, they encompass language, waveforms, diagrams, and images, including videos. This is a classification by data format.

Another classification pertains to data associated with domain-specific knowledge. For example, molecular spectra and structural data related to molecular interactions are vital to chemists; genomic sequences matter to geneticists and biologists; and images of malignant skin tumors are tied to the expertise of pathologists. Chemists may see proteins as large molecules, while biologists or informaticians regard genes as nucleotide sequences (codes).

There is also a cognitive purpose-based classification, including deduction (reasoning), induction, creativity (inspiration), and planning techniques for goal achievement. For instance, the first data discrimination (classification) technique I worked on in the 1960s, which would now be called an SVM (Support Vector Machine), was related to quadratic programming methods. However, its use was primarily in pattern classification and inductive reasoning (inference or conjecture). In other words, cognitive computational techniques (AI) are intricately nested within these domains.
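For readers unfamiliar with the technique, here is a minimal, hedged sketch (not code from Dr. Kaminuma’s own work) of how such a discriminant is trained today; scikit-learn’s linear support-vector classifier solves the same kind of quadratic-programming problem under the hood, and the toy data below stands in for two diagnostic classes.

```python
# Minimal illustrative sketch: a linear support-vector classifier on toy data.
# The two synthetic clusters stand in for two classes to be discriminated; the
# fit itself reduces to a quadratic-programming problem, the connection noted above.
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_blobs(n_samples=200, centers=2, random_state=0)      # toy two-class data
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = SVC(kernel="linear")   # linear decision boundary (maximum-margin hyperplane)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```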

My classification is experiential, and I am not aware of others’ perspectives. Given the current AI boom, I think there’s a need to explain these data science techniques and their classifications in simple general terms. However, to be honest, it’s a formidable challenge.

Figure 1: Computational thinking techniques. In scientific research (D2K Science) that generates knowledge from data, thinking relies on a combination of universal theories and techniques with domain-specific knowledge, theories, and techniques. Data is derived from searches of existing databases, measurements and investigations in laboratories, and records from medical services (clinical records). While computers cannot generate ideas themselves, they can assist experts in generating ideas through data visualization.


Mr. Iwase
The term “AI” seems to have taken on a life of its own. What are your thoughts on this? Additionally, I believe that education for human resource development is a significant challenge that will profoundly impact the future of society. How do you view this issue?
Dr. Kaminuma
It’s not just AI—there isn’t an established academic framework for informatics, like there is for physics. This seems to be related to challenges in education and human resource development. The unconventional physicist Richard Feynman once delivered a lecture on the theme of “computation,” which was later published as a book6). It’s a fascinating book that connects to quantum computing, but in it, he says something along the lines of, “Programming and computational techniques are not science but art.”

In the United States, there are departments or faculties of Computer Science, where Ph.D. degrees are awarded. While the situation may be different now, in Japan, creating an excellent program alone was often insufficient to earn a doctorate. This may be one of the factors contributing to the shortage of talent in this field in Japan.

Advancements in Medicine, Health Science, and Genomic Decoding

Mr. Iwase
There’s a gray zone between illness and health. Will this gray area narrow, or will advancements in medical science lead to rapid progress in health maintenance science? Everyone is curious about how information technology will be involved in this domain and what the future holds. What are your thoughts on this?
Dr. Kaminuma
The concepts of “health” and “illness” are not scientific but rather convenient constructs. During the 1980s, when biology was in the spotlight, Dr. Susumu Ohno of the City of Hope, a renowned Japanese researcher based in the U.S., discussed genes that humans could modify. His proposals have become realistic issues with advancements in genome editing technology. It’s often said that pharmaceutical companies “create diseases,” as defining diseases allows for the development of new drugs7). From a data perspective, this may relate to biomarker discovery. While individual indicators are easier for doctors to understand, in reality, composite indicators are sometimes necessary.
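As a purely illustrative sketch (not something proposed in the dialogue), a “composite indicator” can be as simple as a weighted combination of several standardized biomarkers collapsed into a single score; the markers, weights, and bias below are hypothetical placeholders, not a validated clinical model.

```python
# Illustrative sketch only: combining several individual biomarkers into one
# composite score via a logistic (0-1) transform. All values are hypothetical.
import math

def composite_risk_score(markers, weights, bias):
    """Weighted sum of standardized biomarker values squashed into a 0-1 score."""
    linear = bias + sum(weights[name] * value for name, value in markers.items())
    return 1.0 / (1.0 + math.exp(-linear))

# Hypothetical standardized (z-scored) measurements for one person.
markers = {"fasting_glucose": 1.2, "hba1c": 0.8, "bmi": 0.5}
weights = {"fasting_glucose": 0.9, "hba1c": 1.1, "bmi": 0.4}   # assumed, not fitted
score = composite_risk_score(markers, weights, bias=-1.5)
print(f"composite indicator: {score:.2f}")   # a single number a clinician can track
```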

I’ve proposed the concept of “Health Metrics.” Intervention at the early warning stage is the most effective, as outlined in a roadmap by former NIH Director Elias Zerhouni. While I advocate for interventions at the tertiary prevention stage, there’s growing recognition in the U.S. of the importance of maintaining health through lifestyle innovations in next-generation healthcare8). The key question is whether aging should be considered a disease to be treated. Recently, there has even been research using AI to measure healthy life expectancy9).

Mr. Iwase
Nineteen years have passed since the declaration of the completion of the Human Genome Project in 2001, and an enormous amount of genetic data now exists. Do you think this data is being effectively utilized in drug discovery research?
Dr. Kaminuma
I’m not sure if my perspective is accurate, but about a decade ago, there was widespread discussion about whether the Human Genome Project, which had concluded successfully over a decade prior, had significantly contributed to medicine or drug development. Francis Collins, who led the international team for the Human Genome Project at NIH, became the NIH Director in 2009. Despite budgetary constraints, he established NCATS (National Center for Advancing Translational Sciences) under NIH in 2011, appointing his former subordinate, Christopher Austin, as its director. NCATS promoted collaborative projects with big pharmaceutical companies, and NIH also launched the BD2K (Big Data to Knowledge) initiative, which aims to extract knowledge from big data. In addition, NIH created a position for a leader (P. E. Bourne) to oversee digital initiatives such as AI and data science. Collins also advocated for Precision Medicine, initially targeting cancer (Precision Oncology) and later focusing on PGx (Pharmacogenomics) to promote the proper use of medication10). This reflects his determination to fulfill the promises he made as a leader of the genome project.

Transforming genetic data into breakthrough drugs is still in its infancy and will take time, but I believe progress is steady.

Bio-Pharmaceutical Development and Strategic Data Science

Mr. Iwase
It seems inevitable that the proportion of bio-pharmaceuticals within the pharmaceutical industry will expand and develop further in the future. I’ve also heard that the use of biobanks is evolving across countries. How do you think data science will contribute to bio-drug discovery, and what are the challenging starting points?
Dr. Kaminuma
I don’t know the details, but reading about the success of CAR-T therapy11) gives me hope. However, Japan seems to be significantly lagging behind. While Japan appears to have a strong tradition of excellent immunological research, the issue seems to lie in translating that research into drug development.

This is a very personal impression, but based on my experience since the 1970s working on immunology (such as hepatitis B and complement systems) from a computational application standpoint, I’ve found immunology extremely challenging to handle from a systems (engineering) perspective. The difficulty arises because the subjects, namely cells and molecules (such as complement proteins), continuously change their states. I believe this issue is also relevant to regenerative medicine. It relates to the challenge of “cell fate and control,” which suggests that computational applications require some innovative approaches. However, this very complexity makes it an appealing and worthwhile area to tackle from the perspective of computational and information techniques.

That said, addressing this challenge requires teams composed of biomedical experts and younger generations specializing in computational and information techniques. One example, based on online information, is the European (German-French) LifeTime Initiative12).

In any case, research and clinical applications in cancer and immunology appear to be advancing at an accelerated pace. However, I feel that adequate frameworks to fully process and utilize the data generated are not yet in place. Perhaps it’s necessary to start by focusing on specific targets, such as cancer or autoimmune diseases, and systematically collecting clinical data related to them.

Strategic Use of Public Funding

Mr. Iwase
In Japan, it seems that there is no unified plan across sectors for activities like disease-based biobanking and cohort studies involving sample collection. How do you think Japan can catch up with the rest of the world?
Dr. Kaminuma
Many of Japan’s research and development plans lack strategic depth and often prioritize visually appealing, simplified ideas. At some point, it became mandatory to include illustrative “concept diagrams” in grant applications for scientific research funding. For example, even in robotics research, “human-like qualities” tend to be highly valued. I call this phenomenon the “Astro Boy Syndrome.”

In AI, I believe there should be increased funding for socially critical issues, such as nuclear decommissioning, fraud prevention (e.g., “It’s me” scams), and understanding and addressing dementia. Furthermore, I believe the government should support the development of cancer treatment drugs as a response to the aftereffects of nuclear exposure. Innovation doesn’t emerge from merely aiming for it; it arises from tackling the challenges that need to be addressed.

Regarding biobanks and cohort studies, the 10th issue of *Drug Discovery Plaza* discusses baseline studies. What’s currently needed is a numerical understanding of normal states relative to disease states. Additionally, there needs to be more recognition of the importance of clinical record management techniques and databases that can generate disease-specific knowledge from real-world data (RWD).

For instance, the U.S. company Flatiron Health was acquired by a major pharmaceutical firm for a substantial amount, likely due to the value of its techniques for recording clinical (support) data. Analyzing such data is expected to yield useful insights into cancer care. A similar approach could be applied to clinical records and database construction for autoimmune diseases like rheumatoid arthritis. These can be considered “heuristic databases.” The same principle applies to biobanks.
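The following toy sketch (a hypothetical schema, not Flatiron Health’s actual system) illustrates the basic idea of such a “heuristic database”: minimal clinical records stored in a structured way so that disease-specific questions can be answered directly from real-world data.

```python
# Toy "heuristic database" sketch: hypothetical tables and fields, in-memory only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE patients   (id INTEGER PRIMARY KEY, diagnosis TEXT, age INTEGER);
    CREATE TABLE treatments (patient_id INTEGER, drug TEXT, response TEXT);
""")
conn.executemany("INSERT INTO patients VALUES (?, ?, ?)",
                 [(1, "rheumatoid arthritis", 62),
                  (2, "rheumatoid arthritis", 55),
                  (3, "lung cancer", 70)])
conn.executemany("INSERT INTO treatments VALUES (?, ?, ?)",
                 [(1, "methotrexate", "partial"),
                  (2, "methotrexate", "remission"),
                  (3, "osimertinib", "partial")])

# Disease-specific question answered from real-world records:
# how did rheumatoid arthritis patients respond to each drug?
rows = conn.execute("""
    SELECT t.drug, t.response, COUNT(*)
    FROM patients p JOIN treatments t ON t.patient_id = p.id
    WHERE p.diagnosis = 'rheumatoid arthritis'
    GROUP BY t.drug, t.response
""").fetchall()
print(rows)
```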

Developing such systems in Japan is extremely challenging, but analyzing why this is the case could reveal new areas for innovation.

Mr. Iwase
Japan’s health insurance system seems quite unique compared to those in other countries. Do you think this system could be leveraged to contribute to the advancement of domestic drug development?
Dr. Kaminuma
That’s an excellent idea. Japan’s insurance system operates on the assumption of inherent goodwill. The internet of around 1994 was also supported by volunteers operating on similar goodwill principles. Both of these systems now seem to be faltering, which is why I believe we need to explore new models.

This may sound self-serving, but the nonprofit organization we established, the Institute for Cyber Bond (ICA), aims to address these two changes. Specifically, we are promoting “participatory healthcare.” In simple terms, this means encouraging patients and citizens to make smart use of existing services while taking active steps to manage their health through measures like exercise, diet, sleep, and environmental adjustments without necessarily relying on prescriptions.

In Japan, there is still a prevailing ethos of “patients should just follow instructions,” reflecting a paternalistic approach among doctors. While such paternalism has also been criticized in other developed nations, there is growing recognition in the U.S. and Europe of the importance of patient-centric care. In Japan, companies like Takeda Pharmaceutical have started adopting a “Patient First” approach. In contrast, we advocate for empowering citizens and patients to take proactive roles in maintaining their health and addressing illnesses.

Lifestyle diseases are a prime example where ensuring behavior changes among citizens and patients is crucial. Practical methods to achieve these changes are at the heart of what we should be implementing.

The Challenges of Developing Data Science Experts

Mr. Iwase
The application of data science is not limited to bio-related fields but is also increasingly relevant to small-molecule compounds. Are there any success stories from an informatics perspective?
Dr. Kaminuma
I was involved in founding what is now the CBI Society around 1980 and continued to engage in its operations until 2010. However, I am not certain how much of an impact this activity has had in the field of drug discovery. At that time, pharmaceutical companies were highly secretive. Nonetheless, a recent success story has been reported in detail by a group from Taiho Pharmaceutical involved in the development of Lonsurf13). It seems that data science tools are now recognized as indispensable at various stages of drug development rather than being tied to the development of specific drugs.
Mr. Iwase
Finally, could you share your thoughts on how Japan should approach the development of data scientists in the future?
Figure 2: Diagram of relationships between academic fields. Artificial intelligence is a study that corresponds thought processes to calculations (algorithms). Its foundation lies in mathematical or computational techniques common across various academic disciplines. The theories and computational techniques in the diagram share common concepts. This relationship suggests that pattern recognition, artificial intelligence, machine learning, and D2K Science are deeply connected to existing fields such as mathematics, natural sciences, and engineering (including computer hardware) but are likely to evolve into independent academic domains.


Dr. Kaminuma
I find training human resources for ICT utilization extremely challenging. The fundamental issue lies in a lack of understanding of specialization. Informatics itself has not been systematically organized, and there is no comprehensive overview of how techniques that support or replace human thinking and cognitive functions relate to other fields such as natural sciences, engineering, and medicine.

I foresee that current data science and AI will evolve into a new academic domain, independent of mathematics, natural sciences, and engineering, which could be called “quantitative thought studies.” This field will include applications in healthcare and drug discovery.

One of the core challenges in understanding informatics or computational techniques, including AI and machine learning, is the rapid pace of technological advancements and the increasing breadth and depth of their applications. Moreover, acquiring these skills tends to favor younger generations while being more challenging for older individuals, akin to gymnastics or swimming, where early involvement offers significant advantages. Consequently, a gap widens between developers and users of data science technologies in fields requiring years of expertise.

This generational gap raises issues of how to harmonize the roles, job functions, and career advancement of younger professionals. Their work often does not align with traditional evaluation criteria for researchers in science and engineering. This misalignment is because platforms for presenting results, such as conferences, journals, and research communities, have not kept pace with their work. It is not uncommon for data scientists to be appreciated for their contributions but not adequately compensated or recognized in terms of position or salary.

In the private sector, hierarchical structures like subcontracting often result in poor recognition and treatment of IT specialists who perform the actual work. While this issue resembles the treatment of skilled experimental technicians, it differs in that computational techniques directly relate to human cognitive tasks like reasoning and decision-making. These techniques may lead to “creative destruction,” altering the design of work or research itself.

Given the rapid pace of ICT advancement, professionals focusing on computational techniques need dedicated environments to specialize. Becoming an expert takes time, and young professionals should ideally develop their skills by tackling real-world problems. Creating an environment that facilitates this approach is crucial.

In Japan, particularly in healthcare, what is lacking are strategies and the logistics mindset. From a strategic perspective, NIH declared its commitment to transforming into a data science-based organization14), with “continuous knowledge generation and utilization” as its goal. Data science is inextricably linked to knowledge generation. In the U.S., the National Library of Medicine (NLM), whose director in the 1980s, Donald Lindberg, was both a physician and an AI researcher, has become a foundational institution for genomic medicine15). Japan lacks such a foundational institution. Additionally, the application of AI techniques requires a glossary of technical terms. Japan has few researchers in natural language processing related to healthcare, compounded by challenges with the limited user base of the Japanese language.

To promote data science in Japan, a robust support infrastructure is essential. Training human resources must also be viewed from this perspective. In drug discovery, as NCATS suggests, it’s vital to restructure workflows, particularly pipelines, in pharmaceutical companies and reassess each component from a data science perspective16).

Mr. Iwase
I now understand the significant hurdles Japan faces in data science and its professional training, especially given the complexities of bioscience and healthcare. Having worked in the analytical instrument and bioscience support markets, I recognize that while Japan’s bioinformaticians and IT specialists are highly skilled in specific fields, future education must foster a broader perspective on market transformation and diverse content. We may need to revisit the fundamental structure of our education system. While IT is driving rapid societal transformation, understanding past developments through archives is crucial for gaining a comprehensive view and envisioning the future.

Thank you very much for sharing your valuable insights today.


References

  1. Satoshi Watanabe, *Knowing & Guessing*, John Wiley & Sons, 1969 (Japanese translation: Yoichiro Murakami and Nobuharu Tanji, *Knowledge and Guessing: Scientific Epistemology*, 4 volumes, Tokyo Tosho, 1975); Satoshi Watanabe, *Recognition and Pattern*, Iwanami Shinsho, 1978.
  2. Tsuguchika Kaminuma, *Medical Innovation and Computers*, Iwanami Shinsho, 1985.
  3. Tsuguchika Kaminuma and Shusuke Kurashina (translators), *Computer-Based Medical Consultations: MYCIN* (original by E. H. Shortliffe, Elsevier, 1976), Bunkodo, 1981.
  4. AMIA: American Medical Informatics Association
  5. C. A. Kulikowski, E. H. Shortliffe et al., *AMIA Board white paper: Definition of biomedical informatics and specification of core competencies for graduate education in the discipline*, J Am Med Inform Assoc (2012). doi:10.1136/amiajnl-2012-001053
  6. R. Feynman, *Feynman Lectures on Computation*, Addison-Wesley, 1996.
  7. Translated by Tsuguchika Kaminuma / Supervised by Yukio Tada and Masashi Horiuchi, *The Truth About Drug Discovery: From Clinical to Investment* (Revised Edition, Nikkei BP, 2014): Original by Bartfai T and Lees GV (2006), *Drug Discovery: From Bedside to Wall Street*, Elsevier/Academic Press: Amsterdam.
  8. Tomomi Usui, *The State of U.S. Healthcare: Introduction of Lifestyle Intervention Education into Medical Training*, Biofeedback Research, 45(1): 19-23, 2018.
  9. A. Zhavoronkov and P. Mamoshina, *Deep Aging Clocks: The Emergence of AI-Based Biomarkers of Aging and Longevity*, Trends in Pharmacological Sciences, 40(8): 546-549, 2019.
  10. F. S. Collins and H. Varmus, *A New Initiative on Precision Medicine*, The New England Journal of Medicine, 372(9): 793-795, 2015.
  11. CAR-T Therapy: A treatment that extracts the patient’s own T-cells, modifies them, and reinfuses them back into the patient.
  12. European LifeTime Initiative (https://lifetime-fetflagship.eu/).
  13. Yukio Tada, *The Story of Lonsurf Development*, CBI Journal, 7:13-39, 2019.
  14. *NIH Strategic Plan for Data Science* (https://datascience.nih.gov/sites/default/files/NIH_Strategic_Plan_for_Data_Science_Final_508.pdf); C. Mura, E. J. Draizen, and P. E. Bourne, *Structural biology meets data science: Does anything change?* Current Opinion in Structural Biology, October 2018.
  15. D. A. B. Lindberg, *The National Library of Medicine*, 1984.
  16. NCATS, *Drug Discovery, Development and Deployment Maps* (https://ncats.nih.gov/translation/maps).

Tsuguchika Kaminuma

Born in 1940 in Kanagawa Prefecture. Educated at International Christian University, Yale University, and the University of Hawaii, he earned a Ph.D. in physics. From 1971, he worked at Hitachi Information Systems Laboratory, Tokyo Metropolitan Institute of Medical Science, and the National Institute of Health Sciences. His research spanned pattern recognition, medical artificial intelligence, medical information systems, bioinformatics, and chemical safety. In 1981, he founded an industry-academia-government research exchange organization (now known as the CBI Society) aiming for theoretical drug design. Later, he was involved in interdisciplinary human resource training at Hiroshima University and Tokyo Medical and Dental University. In 2011, he established the Cyber Bond Institute NPO.

Hisashi Iwase

Advisor for Life Science Innovation at the Japan Analytical Instruments Manufacturers’ Association (JAIMA), and President & CEO of BioDiscovery, Inc. Born in 1951 in Tokyo, he graduated from the Department of Industrial Chemistry, College of Science and Technology, Nihon University. He gained extensive experience in management and marketing of analytical and bioscience instruments at Merck Japan, Waters Japan, Millipore Japan, PerSeptive Biosystems Japan, Applied Biosystems, Varian Technologies, and Agilent Technologies. In 2001, he established BioDiscovery, Inc. Since 2013, he has also served as an advisor for Life Science Innovation at JAIMA.