The rapid growth of the biotech industry in recent decades has been fueled by expectations that its technology could revolutionize pharmaceutical research and unleash a stream of profitable new drugs. But with the sector’s market for intellectual property driving the proliferation of start-up companies, and large pharmaceutical firms increasingly relying on partnerships and alliances with small firms to fill out their pipelines, a serious question is emerging: can the industry survive as it matures?

Biotechnology spans a wide range of fields, from the cloning of DNA to the development of complex drugs that manipulate genetic material and biological molecules. Many of these technologies are highly complex and risky to bring to market. But that hasn’t stopped thousands of start-ups from being founded and attracting billions of dollars in capital from investors.

Many of the most promising ideas come from universities, which license technologies to young biotech firms in exchange for equity stakes. These start-ups then go on to develop and test the technologies, often using university laboratories. In many cases, the founders of these young companies are professors (many of them world-renowned scientists) who created the technology their start-ups are commercializing.

But while the biotech system may provide a vehicle for generating innovation, it also creates islands of expertise that impede the sharing and learning of critical know-how. And the system’s insistence on monetizing patent rights over short time horizons doesn’t allow a firm to learn from experience as it progresses through the long R&D process required to achieve a breakthrough.