Genetics and Type 2 Diabetes

When I was first diagnosed with Type 2 Diabetes (T2D) I immediately searched the scientific literature for a clear genetic explanation of this highly prevalent disease, but I was unable to find a convincing link between any specific genetic locus and occurrence of the disease; although several papers proposed such links, they fell far short of proving them.  However, a recent article in Scientific American (October 2015, pp56-59) has suggested one possible explanation for the rise of T2D and a genetic cause that predates the evolution of Homo sapiens!  Reflecting on this article, I can understand why I had not come across this explanation before: the research has always been linked to a different disorder – gout, the “disease of kings”.

Gout is caused by a build-up of uric acid in the bloodstream, which can then crystallise in capillary vessels, leading to immense pain.  Uric acid is swiftly removed from most animals through breakdown by an enzyme called Uricase, but humans and many primates lack a functional form of the gene responsible for production of this enzyme.  Apparently, the loss of function of this gene occurred some 15+ million years ago, when a series of nonsense mutations inactivated it (Oda et al. Mol Biol Evol 2002;19:640–53).  The article proposes that the selective pressure for the loss of Uricase activity began when apes moved from Africa to Europe, which at first offered a plentiful, sub-tropical environment with bountiful supplies of fruit for their diet (particularly figs).

[Figure: palaeotemperature record]

However, this period saw the beginning of climate cooling, and the drier, cooler weather changed the European vegetation from rich broadleaf forest toward a savanna-like environment, with much less fruit available; much of this fruit (especially figs) now became seasonal and quite scarce during winter.  As cooling continued these European apes began to starve, and it is here that the loss of the Uricase gene must have provided a selective advantage (Hayashi et al. Cell Biochem Biophys 2000;32:123–9).  The normal mammalian reaction to periods of starvation is to produce fat (e.g. as an energy supply during hibernation, or to provide sufficient energy to survive winters).  However, during prolonged periods of starvation foraging for food must continue, especially for primates that do not hibernate, and for this to be successful glucose is required by the brain.  This is achieved by an “insulin-resistance” effect.  The clue to this selective advantage lies with the fruit-rich diet that the apes in both Europe and Africa were consuming: digestion of fructose leads to production of uric acid, and researchers have found that uric acid can trigger this switch to “insulin-resistance”.  The researchers’ proposal is that loss of the Uricase gene led to a gradual development of the ability to switch to converting fructose to fat, providing a better chance of surviving food shortages during winter.  They also propose that these European apes brought this major selective advantage back to Africa as they migrated to avoid the cooling winters; there they must have out-competed the African apes, leaving behind the mutated Uricase gene that humans have inherited.

If this explanation of these genetic events is correct, we have a genetic explanation of T2D – sometimes known as insulin resistance – and what we have now is that processed foods, which often contain corn syrup or table sugar and are therefore extremely rich in fructose, are being turned into fat because of the elevated uric acid levels in our bloodstream.  It would be exciting to think that new drugs could be developed against uric acid production, which might help reduce obesity and T2D.  Genetic engineering may even hold the possibility of restoring Uricase production in the distant future.  In the meantime, as I have said before, we must aim to increase regular exercise, reduce sugar intake and make fresh fruit our only supply of fructose.  The antioxidants available in fresh fruit help to reduce many side effects of excess uric acid and protect against multiple diseases.

However, from a personal viewpoint I am left with something of a mystery, as this genetic explanation does not account for familial occurrences of T2D, something I have personal experience of!  The best link between T2D occurrences in families and an observed disorder is that T2D is tightly linked to β-cell dysfunction in the pancreas (O’Rahilly et al. The Lancet 328(8503):360–364), which is associated with insulin resistance (Kahn, 2003. Diabetologia 46, 3–19), but the nature of this genetic link is complex and confused and involves amyloidosis of insulin.  A detailed description of this will follow.

Science, Discovery, Drugs and The Pharma

Having spent all of my working life in universities, I am very aware that my view of the Pharma industry is biased and probably poorly informed. However, I believe that there should always be “space” for an outsider’s viewpoint and scope for questions from an ex-researcher about how to approach scientific discovery. To some extent my views have been influenced by the recent takeover attempt by Pfizer of AstraZeneca (a company with whom I have had some contact), but my ideas have also been formed around “stories” and discussions with people who have worked in a wide variety of companies, small and large, involved in drug development.

Drug development is a very slow process and can take decades from “potential” to production, but the rate of development of new drugs has dropped very significantly over recent years. This is an unexpected situation for a research area that has also contributed many new techniques. New chemistry methodologies have progressed from novel, rapid techniques such as combinatorial chemistry, in which many different steps, in a variety of chemical reactions, occur simultaneously over an array of substrates, through recombinant DNA techniques that allow the artificial construction of DNA sequences that produce novel products, to the single molecule biochemistry that has developed over recent years (and in which I was slightly involved). Despite all of this new technology, the number of new, successful drugs reaching the marketplace continues to fall and in some cases this situation has grown critical (e.g. new antibiotics). The concept of personalized medicine, built around knowledge of an individual’s genetic background from the human genome sequencing project and cheap DNA sequencing, is now available, but it too has failed to drive the development of new drugs.

All of this must be set against the research background that leads to drug discovery, and this quickly leads to the question: “are the large Pharma companies the best system for unique research that might lead to drug discovery?” The views of the current President of the Royal Society, Paul Nurse, as expressed in a recent issue of The Observer (18th May, 2014), suggest that the financial interests of the shareholders far outweigh any “zest” for funding research science, and the Pfizer takeover bid for AstraZeneca provides a classic example of such a situation. Such worries are compounded by situations where, following a takeover, a company might suppress certain areas of research, or discoveries, in order to enable expansion, or sale, of their own products — I have heard of several such situations across a number of companies and it is unclear exactly how much “buried” science there is.

It has been a long-standing view amongst research-active academics in the UK that “blue-sky research” should be independent of industrial, or profit-orientated, funding and that such research is more likely to lead to important discoveries (including drugs) than profit-driven research, which is often too focused on developments related to existing science policy within a company. However, there are many examples of the large Pharma funding research projects within universities, and some of these projects have “blue sky” elements to their research direction. My own feeling is that there are not enough of these types of projects (at least in the UK), that they are difficult to set up, and that the companies often try to influence the direction of the research for reasons that are not connected to the science.

Perhaps the best example is illustrated by the development of antibiotics (or more correctly by the lack of recent development of antibiotics). Although, during the 1950s, the widespread use of antibiotics had a major impact on health by reducing bacterial infection, reducing the risks from many pathogens to only a few cases, their use in large quantities for the treatment of animals led to the spread of surplus antibiotic in the environment and, subsequently, to the development of antibiotic resistance in many soil bacteria (natural variation in populations of bacteria includes some spontaneously resistant variants, which are selected from the population and allowed to grow in the presence of the antibiotic). Unfortunately, this resistance was able to spread quickly amongst a wide range of bacteria, including many responsible for human ailments (bacteria often carry small, independent pieces of DNA called plasmids that can readily transfer from bacterial strain to bacterial strain, and these plasmids were found to also carry the genes for antibiotic resistance). When this situation is combined with a habit amongst patients of not taking the full course of an antibiotic, which can also select for antibiotic-resistant bacteria within a population, there is a real problem with regard to treatment of infection. However, antibiotics are not a powerful “earner” within the drug development market — being prescribed for only a week, or some other limited period, compared with “lifestyle drugs” such as beta-blockers that must be taken for the rest of the patient’s life — and this weakens the case for antibiotic development within the Pharma. In fact, the Pharma and drug companies in general have shown little interest in developing new antibiotics despite the growing problem of resistant strains of infectious bacteria and particularly MRSA infections in hospitals. This is undoubtedly the best evidence that the Pharma are not necessarily guided by good science, but rather by profit for shareholders.

Another problem associated with the Pharma concerns the release of information about the drugs these companies sell. As reported in the June 2014 edition of Scientific American, even the doctors that prescribe the drugs rarely have information about side-effects (other than the sanitized details released with the packaging). However, this highly protective, negative attitude toward the release of scientific information (which is anathema to most research workers) may be about to change as the drug companies are forced to accept that drug trials, which are often funded by public money, should be made available in the public domain. European law already requires this form of release, and large Pharma such as Pfizer have now set up Web portals to release such data files. Negative information can often be the most important and yet rarely gets published, so this type of release and sharing of information can only be for the good. The Pharma should be required to always share this data as part of a more open approach to better collaboration in science, but only governments can ensure this happens.

So, what does the future hold? Well, I found it very interesting that the government was willing to become involved with the Pfizer takeover bid, even though this was not in support of the science, and it is important to remember that all of the Pharma have developed drugs and other products on the back of government-funded science (usually at universities, but also through small spin-off companies). Why do we not, as a country, take the next logical step in this process and forge much tighter links between universities and the Pharma — not in the usual way of depending upon academics to do this, but at a national level with government funding and tax incentives — and look to drive development of commercially relevant science through blue-sky projects in areas that can be more reactive to actual needs (researchers are usually the first to react to problem situations, as they often write research papers explaining the problem)? It need not cost the government any more than it currently spends, but by drawing the Pharma into tighter interactions and promoting such activities we could build a better research base. The government could then better steer the output of the industry and request suitable drugs (and antibiotics) from this research. Such a system would undoubtedly have unexpected “spin-offs”, especially in the area of technological developments, which would greatly spur this type of research. Patents would still protect discoveries and profits would still be made as before.

Ah well, I can dream!

GM Plants with a different storyline

Another Scientific American article (William Powell, March 2014, pp54-57) inspired me to write this post; it tells the story of genetically engineering a plant – not a crop, but a tree!  First, a quick background on why….

Apparently, in 1876 an unfortunate situation developed following the importation of chestnut seeds from Japan: it turned out that these seeds were contaminated with spores from a fungus (Cryphonectria parasitica) to which American Chestnuts were highly sensitive, but to which the Japanese Chestnuts were immune.  This fungus effectively strangles the tree through the growth of its mycelial fans, which produce oxalic acid that destroys the bark of the tree while allowing growth of the fungus.  It is this dead wood, produced by the action of the oxalic acid, that strangles the tree as it tightens its grip on the trunk.  Only 50 years after the initial import of this deadly fungus, more than 3 billion trees were dead!

A programme of research was initiated to produce hybrid trees by crossing Chinese variants, which are also resistant to the fungus, with American trees to produce a hardy hybrid, but this work will take many years.  Therefore, a project was initiated in parallel to make use of what was, at the time, a novel approach: genetic engineering of the plant.  As is often the case in science, this idea was built around a fortunate coincidence in which a group had isolated a wheat gene for oxalate oxidase and introduced this gene into other plants using a well-described engineering system based on Agrobacterium.  This enzyme was, of course, ideal for the proposed project as it breaks down oxalic acid, the primary cause of the blight damage.  In addition, they had available genes that produce antimicrobial peptides (AMPs) that disrupt C. parasitica infection and, as time has passed, genome sequencing projects have pointed to the genes in Chinese Chestnut trees that are responsible for resistance to the fungus.  The future looks promising for genetically engineering the tree instead of depending upon hybrids.

The use of the soil bacterium Agrobacterium tumefaciens is an interesting story in itself and a subject I enjoyed teaching as a perfect example of using a natural system for advanced technology.  The odd thing about this bacterium is that it has the ability to infect plants with its own DNA, making the plant produce unusual amino acids that the bacterium cannot synthesise itself.  The result of this infection with foreign DNA is that the plant develops small tumours, but the bacterium benefits from the availability of these biomolecules.  Genetic engineers were able to manipulate this system so that they could insert “foreign” DNA into the bacterial plasmid, in place of the tumour-forming components, and enable the bacterium to transfer this foreign DNA into a wide variety of plants in a stable and predictable manner.  Eventually, the research group were able to develop the mechanisms for tissue culture of the genetically altered plant cells, and a model system based on poplar trees was available to initiate the experimental approach to overcoming the blight infection.

There are now more than 1,000 transgenic Chestnut trees growing in field sites, public acceptance of this approach to restoring a small piece of biodiversity is good, and the future holds promise for further such experimentation.  My own view is that this is a piece of genetic engineering that sounds very good and very promising for the future.  My only caution, also expressed by the researchers, is that the spread of the genetically modified seeds, which may help remaining trees recover from infection, may also lead to cross-pollination with closely related plants.  However, there are few trees closely related to the American Chestnut, so this seems unlikely.  A good story that supports genetic engineering in plants!

 

A model organism – the virtual bacterium

I was reading an article in Scientific American today that got me thinking about the complexities of biology – the article described the production of a virtual bacterium, using computing to model all of the known functions of a simple single cell.  The article was a very compelling read and presented a rational argument for how this could be achieved, based on modelling a single bacterium that would eventually divide.  The benefits of a successful approach would be immense for both healthcare and drug development, but the difficulties are equally immense.

In order to simplify the problem, the organism chosen to be modelled is the bacterium with the smallest known genome – Mycoplasma genitalium, a bacterium with just over 500 genes, all of which have been sequenced.  This bacterium is also medically important, which adds weight to the usefulness of a virtual organism.  The problem of programming the computer was divided into modules, each of which describes a key process of the cell, and all of which feed back on one another using measured binding coefficients for protein-protein and protein-DNA interactions.
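
To make the modular idea concrete, here is a minimal sketch of how such a scheme might be organised (this is purely my own illustration, not the Stanford code; the module names, species and rate constants are invented).  Each module updates a shared cell state in turn, so the output of one process feeds back into the others at every time step.

```python
# Minimal sketch of a module-based cell simulation (illustrative only; the
# modules, species and rate constants are invented, not the published model).

cell = {"mRNA": 10.0, "protein": 100.0, "atp": 1000.0}

def transcription(state, dt):
    # mRNA production slows as ATP runs low (a crude feedback)
    rate = 0.5 * state["atp"] / (state["atp"] + 500.0)
    state["mRNA"] += rate * dt
    state["atp"] -= 2.0 * rate * dt

def translation(state, dt):
    # protein synthesis depends on available mRNA and ATP
    rate = 0.2 * state["mRNA"] * state["atp"] / (state["atp"] + 500.0)
    state["protein"] += rate * dt
    state["atp"] -= 4.0 * rate * dt

def metabolism(state, dt):
    # ATP is regenerated, minus a maintenance cost that grows with protein load
    state["atp"] += (5.0 - 0.01 * state["protein"]) * dt

modules = [transcription, translation, metabolism]

dt = 0.1
for step in range(1000):
    for module in modules:
        module(cell, dt)   # every module sees the state left by the others

print(cell)
```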

As I read the article, I began to realise that there were some simple problems with the description of how the computer would “manage” the cell, and when the author described doubling protein concentrations prior to cell division I knew there were problems with their model – this simplistic approach is not what happens in the cell.  Cellular control is an important aspect of this modelling and must be correct if the cell is to be realistic.  I can illustrate what I mean with one example – plasmid replication.  A plasmid is an autonomous circle of DNA, often found in bacteria; plasmids are easy to understand and ubiquitous in nature.

Replication of plasmid DNA:

The number of copies of a plasmid is tightly controlled in a bacterial cell, and this control is usually governed by the level of an encoded RNA or protein: when the concentration of this product drops, such as AFTER division of the cell, replication occurs and the number of plasmids increases until the correct plasmid number is attained (indicated by the concentration of the controlling gene product).

This is a classic example of cellular control and is very different from a model based on doubling protein concentration in anticipation of cellular division.
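
A toy simulation makes the distinction clear.  In the sketch below (my own illustration, with invented numbers) replication is only permitted while the concentration of the plasmid-encoded inhibitor is below its threshold, so the copy number recovers after each division rather than being doubled in anticipation of it.

```python
# Toy model of plasmid copy-number control by a plasmid-encoded inhibitor
# (all numbers invented for illustration).
import random

target = 20          # copy number at which the inhibitor fully blocks replication
copies = target

def inhibitor_level(copies):
    # inhibitor concentration simply tracks plasmid number here
    return copies / target

for generation in range(5):
    # cell division randomly partitions the plasmids, roughly halving the number
    copies = max(1, sum(random.random() < 0.5 for _ in range(copies)))
    # replication proceeds only while the inhibitor is below its threshold
    while inhibitor_level(copies) < 1.0:
        copies += 1
    print(f"generation {generation}: copy number restored to {copies}")
```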

Mycoplasma genetics and the complexity of biology:

This whole problem also got me thinking about my own subject area and my brief excursion into the study of restriction enzymes from Mycoplasma.  Restriction systems are a means for bacteria to protect themselves from viral invasion and, despite the small size of the genomes of Mycoplasma strains, they encode such systems.  There is clear evidence that such systems are “selfish”, and maybe they are fundamental to the long-term survival of bacteria, so I think they need to be dealt with in the model organism.  However, things begin to get complicated when you look at the well-described system from Mycoplasma pulmonis (a slightly more complex relative of the organism used for the model).  Instead of a single set of genes for a restriction system, as usually found in other organisms, the restriction genes of Mycoplasma pulmonis are capable of switching in a way that can generate four different proteins from a single gene.  This is where the complexity of modelling an organism arises: while the organism used may have a simple genome, it is important to know how even simple organisms can increase their genetic complexity without increasing their DNA content.

Conclusion:

I think the work at Stanford is both interesting and important, and I think they have achieved a very important step along the road to modelling a living cell, but I also think they may need more information, and more complex modules, as they try to be more accurate with even these simple organisms.  It will be a long road before we have a model of a human cell, but what an incredible thought that would be!

 

Who owns science?

You might think that this is a simple question with a simple answer, but the truth is far from simple, and this subject led to a very lengthy debate the last time I raised it with my colleagues.  However, before we get to the subject a simple definition or two are needed:

  • Science – what I mean by science is “experiment-based discovery science” of the type that is carried out in universities.  I do not mean industry-funded science, or research that involves review of a subject.  Pure research, carried out for the sake of interest, is often known as “blue-skies” research, but it can often lead to unexpected commercial outcomes.
  • Ownership of science – by ownership I really mean how accessible scientific information is, as it is access to the science that defines ownership.  I hope this will become clearer as I develop this blog!

I guess before we get to ownership of science it is important to first explain how research is funded in the UK and how it is carried out in Universities:

Research Funding:

[Figure: science funding graphic]

There are a large number of funding sources available both in the UK and across Europe, of which the research councils and European research grants are the largest funders, but significant funding also comes from charities and from private sources.  All of these types of funding are competitive and awarded to individual scientists, or groups of scientists who collaborate toward an overall goal.

Universities also receive direct funding of research from government, through the research councils, in the form of infrastructure awards (often based on how many research grants were awarded, but also on measured success of individual researchers).  Sometimes, this funding is targeted at commercially-orientated research and sometimes at “blue-skies” research.  In addition, there are various sources of infrastructure funding, to which universities can bid in a competitive way, in order to establish equipment or resources for research.

Finally, individual researchers may have access to funds that allow small research projects to be initiated, that are either university-based or belong to the individual within the university’s research framework (overheads and slush funds).

Establishing a research project:

Any full-time employee at a university can apply for a research grant and carry out research; although, to get a competitive grant the individual usually needs an established research profile.  However, it is often a surprise to those outside of the university system that carrying out research is not a contractual requirement for a university employee, but simply something that is often expected or desired by the employer.  So, academics do not have to apply for research grants and are not forced to do so – research usually springs from their own interests – and many academics only carry out teaching duties.

Those that want to engage in a research project have two ways to start:

  1. Join an existing research group and follow their own path within that research group.
  2. Establish a new research group, seek external funding and hope to gain sufficient expertise to follow the first funding with further funding – often a difficult pathway.

It is generally accepted that the chance of obtaining funding from most sources is at best 1 in 5, so it may well take five applications to get one grant, and sometimes this process also means changing the details in the grant application and looking to different funding sources – becoming an established researcher is not easy and may require many hours of reading and writing!  Sometimes support comes from the university in the form of PhD studentships, which lend a pair of hands to the process of obtaining enough results to add weight to an initial application.  In addition, some funding sources include grants aimed at new researchers (often young scientists at the beginning of their career).

Measuring Research Ability:

In this modern era, where every work-based activity is monitored for efficiency, science is no different, and grants are only awarded to researchers who have a strong rating in what is known as the Research Excellence Framework (REF – previously known as the Research Assessment Exercise, or RAE).  The award of externally-funded grants is a major part of this exercise, but the other major measurement of research excellence is publishing in peer-reviewed journals, and this brings us to our main subject, as this is the first measure of access to science.  Without easy access to published research it is impossible to write a successful research grant proposal.


It should be clear from what I have already said that access to published science is the starting point for writing any grant application.  Strangely, despite the fact that the researcher carries out the research, he does not necessarily have access to even his own published work.  This is a quirk of using publishing houses to print and distribute published science, but it is also a trap created by the REF exercise, where a main requirement is to publish in high-impact journals to improve the REF rating.  However, these high-impact journals are usually owned by the major publishing houses, and the general method of publication means that copyright lies with the publishing houses.  This problem of access to published science is compounded by the fact that the publishing houses restrict access to published papers unless you subscribe, in one form or another, to the journal!  Recently, there has been a strong movement amongst scientists to change the way science is published, but this is still a problem area.  Some grants include sufficient funds to pay for “open-access” research papers, but many do not.  In fact, a good illustration of this problem is how difficult it is for the general public to access published science – without a library subscription to a number of journals, the cost would be prohibitive.

In summary, even the best scientists do not have immediate access to their own published work, at best they depend on their university library to purchase journals that enable such access and as such they do NOT own their own research!  If they have used grant funds to publish in an open access journal then they will be able to read and access that paper, along with anyone else in the world, and therefore they will have “bought” ownership of their research.

Invention, Patents and Ownership:

Of course, publishing research papers is only one aspect of research, but it is the primary means by which research is disseminated and, as a consequence, a very important aspect of science.  However, some research leads to invention, which under the British patent system is the primary basis for obtaining a patent.  A patent allows release of the details of the science, publication to the general public, but uses the “strong arm of the law” to prevent the work being copied, allowing the inventor the right to commercial exploitation.  If the research looks as though it may lead to commercialisation then a patent is a very important means of protecting the work and the ideas.  Often a university will pay the costs of patent applications, but who owns the patent and who owns the research that led to it?  There is no doubt that the researcher’s input is an absolute requirement for the invention, but university employees (in the UK) are subject to a clause in their contract that states that any invention arising from research at the university will belong to the university!  One argument used to support this situation is that without the university’s infrastructure (especially high-cost equipment) the invention might not be possible.  So, a patent may have the inventor’s name on the front, but ownership is the university’s.

So, who really owns science?  Well, it doesn’t really seem that it is the researcher (in many cases they cannot even guarantee that they can read their own papers); although there are often benefits that come the way of the researcher that make the work worthwhile (such as reduced teaching loads), overall science is a hobby at work and the real benefits are simply your name on papers, grants and patents.  This may be best illustrated by DNA sequencing, which is named after the scientist who developed the technique – Sanger sequencing – and despite his recent sad death, Fred Sanger’s name will live on because of this.  Very few scientists make money from their research, but it can be fun!  However, there is a final aspect of all of this that should be the main thought of any budding scientist – RESEARCH IS OWNED BY SOCIETY – the benefits that come from research are unpredictable and varied, but the technological progress of recent years is one example of the benefits of good science, and the description of the research should be freely available to everyone.

 

Proteins, Peptides and Amyloids – Alzheimer’s Disease

If you have read my recent science blogs you will be aware that I have an interest in Alzheimer’s Disease based on work involving protein aggregation.  A recent article by Bhattacharjee and Bhattacharyya (Journal of Biological Chemistry, 2013. 288(42): p. 30559-30570) brought back a result obtained in my lab many years ago and got me thinking about how small peptides can affect protein aggregation.

So, first the unexpected result from many years ago:

At the time we were studying a small peptide called Stp, which was able to switch off a complex restriction and modification enzyme called EcoprrI, but the researcher carrying out this work made an unexpected observation: the Stp peptide, when added to a group of proteins of different sizes (used as size markers), altered their apparent size and even aggregated many of the proteins (the aggregation was visible in the wells at the top of the gel).  This peptide was found to be able to inhibit certain protein-protein interactions (later we realised that this is how it prevented the restriction enzyme from working), but clearly it could also affect the behaviour of other proteins in a gel.  The effect was primarily aggregation, but the result made me think at the time that maybe amphipathic peptides might also influence, even disrupt, protein-protein interactions.  We had just observed that the EcoR124I restriction enzyme could dissociate as a means of controlling function, and I wondered if Stp would enhance that dissociation – lo and behold, Stp did indeed disrupt the subunit assembly of EcoR124I, and of EcoprrI; we had demonstrated how the anti-restriction activity of this small peptide worked.

And so, secondly, to the recent observations with amyloidosis:

Alzheimer’s Disease is initiated by protein aggregation, when β-amyloid (Aβ) peptide oligomerises into fibril structures that eventually form plaques within the brain.  Disruption of these aggregates would be a very important treatment for Alzheimer’s and is an area of intensive research.  What Bhattacharjee and Bhattacharyya have shown is that a small peptide, found in Russell’s viper venom, not only destabilises the amyloids but is also stable in blood for up to 24 hours.  This is a very interesting and promising observation that should stimulate the study of the effect of peptides on protein-protein interactions and perhaps lead to a non-toxic version of the peptide that could be used to treat Alzheimer’s.

Sometimes, it is very interesting how one piece of science can stimulate interest in another, as illustrated above, but also shows how diverse areas of research can sometimes be linked.  Great ideas are not always the result of hard work, but more often arise from interactions between different researchers – keep collaborating people!

Update – Nov. 2015:

In the latest issue of Scientific American, under the heading “Advances”, they report an article in Nature from work at UCL in which autopsies of several patients who died from CJD (the human version of “mad cow disease”, which they acquired from infected growth hormone treatment) revealed evidence of the amyloid formation associated with Alzheimer’s – at too early an age for natural onset.  Further work suggests that amyloid precursors, or small clumps of the beta-amyloid, may act in the same manner as prions do in the onset of CJD and lead to Alzheimer’s disease.  It would seem to me that the time is now ripe to begin a serious study of the protein misfolding, aggregation and conformational changes that may trigger these disorders.

Synthetic Biology – will it work?

Every now and then science comes up with a new approach to research that impacts on technology, but often these approaches are controversial and the headlines we see are far from the truth and can damage investment in the new techniques.  One good example is the genetic modification of plants and the production of GM foods, which have a really bad press in Europe despite many obvious benefits for the world economy and for disease control.  The latest technology, which follows from the explosion in genetic engineering techniques during the 1990s, builds on concepts developed in bionanotechnology and is known as Synthetic Biology.  But what is Synthetic Biology?  Will it work?  And what are the dangers versus benefits of these developments?  Gardner and Hawkins (2013) have written a recent review about this subject, which made me think a blog on the subject was overdue.

My background in this area is two-fold:

  1. I was a part of a European Road-Mapping exercise, TESSY, that produced a description of what Synthetic Biology is and how it should be implemented/funded in Europe.
  2. I was also Project Coordinator for a European research project – BioNano Switch, funded by a scheme to support developments in Synthetic Biology, that aimed to produce a biosensor using an approach embedded in the concepts of Synthetic Biology.

So, what is Synthetic Biology?  I think the definition of this area of research needs to be clearly presented, something that was an important part of the TESSY project, as the term has become associated simply with the production of an artificial cell.  However, that is only one small aspect of the technology and the definition TESSY suggested is much broader:

Synthetic Biology aims to engineer and study biological systems that do not exist as such in nature, and use this approach for:

  • achieving better understanding of life processes,
  • generating and assembling functional modular components,
  • developing novel applications or processes.

This is quite a wide definition and is best illustrated with a simple comparison: in electronic engineering there exists a blueprint (circuit diagram) that shows how components (resistors, capacitors etc.) can be fitted together in a guaranteed order to produce a guaranteed result (a device such as an amplifier).  The Synthetic Biology concept would be to have a collection of such components (DNA parts that include promoters, terminators, genes and control elements; cellular systems including artificial cells and genetically engineered bacteria capable of controlled gene expression; interfaces that can connect biological systems to the human interface for useful output).  This would mimic the electronic situation and provide a rapid mechanism for assembly of biological parts into useful devices in a reliable and predictable manner.  There are many examples of such concepts, but the best known is the BioBricks Foundation.  However, at the TESSY meeting I was keen to make it clear that there are fundamental problems with this concept, so what are the problems?

At its simplest, a BioBricks database would consist of a number of different types of DNA part (promoters, short DNA sequences that switch a gene on; terminators, short DNA sequences that switch a gene off; control elements, DNA sequences that control the promoter switching a gene on or off as required; genes, DNA sequences that produce biotechnologically useful products; and cells, the final package that enables the DNA to do its work and produce the required product), which sounds logical and quite simple.  However, biological systems are not as reliable as electronic systems and combinations of promoters and genes do not always work.  One of the major problems with protein production, using such artificial recombinant systems, is protein aggregation resulting in insoluble proteins that are non-functional.  In addition, there are many examples (usually unpublished) of combinations of BioBricks that do not work as expected, or that, if used in a different order, also result in protein aggregation – none of which ever happens with electronic components.  The reasons are far from clear, but are closely related to the complexity of proteins and the need for them to operate in an aqueous environment.  My thought about how to deal with this situation is to have a large amount of metadata associated with any database of BioBricks, including information about failures or problems of protein production from specific combinations.  However, I am not aware of any such approach!
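
As a sketch of the sort of thing I have in mind, a parts registry could attach metadata, including known failures, to each combination of parts.  The part names, outcomes and record structure below are entirely invented for illustration; the point is simply that failed combinations are recorded and searchable.

```python
# Sketch of a parts registry that records metadata about part combinations,
# including failures (all names and records are invented for illustration).
from dataclasses import dataclass, field

@dataclass
class AssemblyRecord:
    parts: tuple        # ordered part identifiers, e.g. (promoter, gene, terminator)
    outcome: str        # "soluble", "aggregated", "no expression", ...
    notes: str = ""

@dataclass
class Registry:
    records: list = field(default_factory=list)

    def report(self, parts, outcome, notes=""):
        self.records.append(AssemblyRecord(tuple(parts), outcome, notes))

    def known_problems(self, parts):
        key = tuple(parts)
        return [r for r in self.records if r.parts == key and r.outcome != "soluble"]

registry = Registry()
registry.report(["P_strong", "geneX", "T1"], "soluble")
registry.report(["P_strong", "geneY", "T1"], "aggregated",
                "insoluble at 37 C, partly soluble at 18 C")

for problem in registry.known_problems(["P_strong", "geneY", "T1"]):
    print(problem.outcome, "-", problem.notes)
```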

There are other aspects of Synthetic Biology that do not depend on BioBricks, and one example is the artificial cell.  The ideal for such a system is a self-assembling package, capable of entrapping DNA, capable of replication and survival, and able to produce useful biomaterials; significant steps have been made toward such a system.  However, one area of concern, as such systems are developed, is containment – can we really be sure these artificial microbes will remain in a contained environment and not escape to interact with, and possibly change, the natural bacterial population?  However, the power and capability of such a system should not be underestimated and its likely uses in future medicine could be immense – simple examples would be as delivery systems for biomaterials that can activate cellular changes by targeting the required cell and then switching on protein production (e.g. hormones).  This type of targeted medicine would be a major breakthrough during the later part of this century.

Another type of Synthetic Biology involves the artificial assembly (possibly self-assembly) of biomaterials onto an artificial surface in a way that is unlikely to occur naturally, but provides a useful device – I see this as more like what a BioBricks project should be – such a system is usually modular in nature and the biomaterial would normally be produced using recombinant techniques.  The research project I mentioned earlier involved such a device, and the outcome was a single molecule biosensor for detecting drug-target interactions at the limits of sensitivity.  The major issue we had with developing this device was the precise and accurate attachment of biomaterials to a surface in such a way that they function normally.  However, overall the project was successful and shows that a Synthetic Biology approach has merits.

What are the benefits that Synthetic Biology can provide society?  Well, one advantage is a more systematic approach to biotechnology, which to date has tended to move forward at the whim of researchers in academia or industry.  Assuming the problems associated with protein production, mentioned above, can be better understood, then there could be a major boost in the use of proteins for biotechnology.  In addition, Synthetic Biology techniques offer a unique opportunity for miniaturisation and mass production of biosensors that could massively improve medical diagnosis.  Finally, artificial cells have many future applications in medicine, if they can be produced in a reliable way and made to work as expected:

  1. They could provide insulin for diabetics.
  2. Be made to generate stem cells, which could be used in diseases such as Alzheimer’s and Huntington’s.
  3. They could deliver specific proteins, drugs and hormones to target locations.
  4. They could treat diseases that result from faulty enzyme production (e.g. Phenylketonuria).
  5. They could even be used to remove cholesterol from the blood stream.

However, there are always drawbacks and risks associated with any new scientific advance:

  1. Containment of any artificial organism is the most obvious risk, and this is compounded by the possibility of using the organism to produce toxins that would allow its use as a biological weapon.
  2. The ability to follow a simple “circuit diagram” for protein production, combined with a readily available database of biological material, could enable a terrorist to design a lethal and unpredictable weapon much more complex and perhaps targeted than anything known to date.
  3. Inhibition of research through a readily available collection of materials that prevents patent protection of inventions.  This could be complicated by the infringement of patents by foreign powers in a way that blocks conventional research investment.
  4. Problems associated with the introduction of novel nano-sized materials into the human body, including artificial cells, which may be toxic in the long term.

My own feeling is that we must provide rigorous containment and controls (many of which already exist), but allow Synthetic Biology to develop.  Perhaps there should be a review of the situation by the end of this decade, but I hope that the risks do not materialise and that society can benefit from this work.

Protein aggregation being seen as important?

I have just come across a paper that summarises some of my views on what should be important research. Murphy and Roberts (Biotechnology Progress, 2013, 29(5): p. 1109-1115) have made the observation that for many years protein aggregation has been seen as a nuisance factor that prevents high yields in production of protein from recombinant sources, or as a hassle with certain purification methodologies.  However, the understanding of the mechanism behind prion misfolding, which leads to BSE and CJD, and amyloid formation, which is involved in Alzheimer’s disease and other diseases, has shown that this subject should never have been ignored for so long.

So what is protein misfolding?  When a protein is first synthesised by the ribosome it immediately begins to fold as hydrophobic amino acids are incorporated (these small components of proteins, strung together to form the protein, can be either hydrophobic – disliking a water-based environment – hydrophilic – preferring to be surrounded by water – or neutral).  This hydrophobic core will fold to minimise contact between water and the hydrophobic amino acids.  However, the process is not simple and many other proteins and factors can be involved, including chaperone proteins that help the native protein to fold correctly, and when a protein is produced in very large amounts this folding system may go wrong and the protein may aggregate as a simple means of avoiding exposure of the hydrophobic core.  Over-production of proteins from recombinant (genetically engineered) sources is a classic starting point for this problem and has for many years been a major issue for the biotechnology industry.  However, for the research scientist there has been a different issue associated with this problem – wasted research time that cannot be published!
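
One simple way to see where hydrophobic stretches lie in a sequence, and hence where a buried core or an aggregation-prone region might form, is a sliding-window hydropathy plot.  The sketch below uses the standard Kyte-Doolittle hydropathy values; the example sequence and the window and threshold settings are just illustrative choices.

```python
# Sliding-window hydropathy (Kyte-Doolittle scale); hydrophobic stretches
# score positive and are candidates for a buried core or for aggregation.
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def hydropathy_profile(sequence, window=9):
    scores = []
    for i in range(len(sequence) - window + 1):
        segment = sequence[i:i + window]
        scores.append(sum(KD[aa] for aa in segment) / window)
    return scores

seq = "MKTLLVLLFLAVALAKDDERSGSGSG"   # invented example sequence
for position, score in enumerate(hydropathy_profile(seq)):
    flag = " <-- hydrophobic stretch" if score > 1.5 else ""
    print(f"window starting at residue {position + 1}: {score:+.2f}{flag}")
```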

I can illustrate this problem with a story from my own work.  I have worked with a multi-subunit protein for many years and a key step in this work was being able to over-produce one key protein (the largest) in milligram quantities.  Once we had found a mechanism to do this we were able to publish the work, but we always looked for a better, or easier, solution.  Interestingly, one such solution was to fuse another protein onto the required protein, which could then be used to pull it out of the bulk mixture of all proteins.  The fusion partner can be attached to either end of the protein of interest and yet, when we swapped ends for the attachment, the fusion was insoluble due to aggregation.  Yet this negative result was never published, as there is no journal that would encourage such a publication, despite the information being really important to others working in the area.

The problem in science is that journals do not publish negative results even though such information can be very useful.  I would love to see some of the more enlightened journals review this policy and start to publish negative results. Maybe the renewed interest in protein aggregation, driven by its importance in disease states, might encourage this new approach.  I have a feeling that there is already a lot of data about the causes and reasons for protein aggregation that could help us understand amyloidosis and other protein-based diseases, but also, more importantly, a lot of expertise that could benefit the research in this key area.

You are what you eat?

The above title is an old quotation that has become something of a common concept as modern Western society finds obesity to be a major issue.  But it was reading Scientific American this month that led to me writing this blog…

The September issue of Scientific American is titled “Special Food Issue” and reading some of the articles has made me think about the thorny subject of healthy eating.  There is a wide range of subjects covered, ranging from bees used to pollinate crops to genetically engineered food.  However, the article I found most interesting was aimed at addressing the issue of obesity, its causes and the complex issue of the calorific value of foods.

So, I will start with this issue of calories – what the article was keen to point out is that the use of calories to label food content is a little arbitrary and sometimes misleading.  There is no attempt to understand how easily any specific food is taken up by the body, how much energy the food consumption may itself involve, or whether gut bacteria are in fact using most or perhaps all of the available energy for their own existence!  While the latter point is a little extreme, it illustrates the problem.  The other important factor is that the calorie is used to label protein, fat, carbohydrate and other energy sources, but, in fact makes little allowance for their respective destiny in the body, which may or may not make them available for energy use.  It is in fact this last point that is really the crux of the problem.

Apparently, there are two theories as to the main cause of obesity in Western society:

  1. What I would call the IN-OUT theory, which is very simplistic: if you do not burn all of the calories you consume, you will lay down the excess calories as fat and increase your body mass.
  2. The second theory is more complex and is known as The Hormone Control Theory, which suggests that the use of different energy sources and the subsequent storage of energy within the body is controlled by hormones (particularly insulin) and this in turn is triggered by sugar content in blood.

Interestingly, the first theory is the one that is currently the most popular and is used to guide government-based healthcare decisions.  In contrast, the second theory was more popular before the Second World War (WWII), but has become unfashionable – despite never having been properly tested and, therefore, never disproved.

My own view is, as with most theories involving biological systems, that both will most likely be true and each will be important in certain circumstances.  It is clear that the “couch potato” lifestyle is not healthy and almost certainly illustrates point 1 very clearly.  However, there are many examples of people with a very high body mass index who have changed both their diet and their exercise regime, yet fail to lose weight despite what is clearly a major increase in exercise levels.  The importance of insulin-glucose ratios in point 2 is critical and closely links the problem to diabetes and many other diseases.  The presence of glucose in the blood stream induces insulin release from the pancreas, which enables transport of the glucose into muscle and other cells, where it is converted to fatty acids for long-term storage.  In diabetics, where insulin production may be non-existent (type 1 diabetes), hyperglycaemia is quickly observed as glucose levels remain high, or increase.  In type 2 diabetes insulin is produced at lower levels than required, or is faulty, and produces a reduced glucose uptake into cells.  However, insulin also has another role in metabolism, which is to regulate fat cells and to prevent release of stored fatty acids as an energy source (this is because the presence of insulin reflects the availability of blood glucose, which is a more efficient energy supply than fat).  If insulin levels are too high then fat storage will continue and weight gain will result.

It seems likely that highly processed foods contain carbohydrates that deliver an exceptionally high glucose level to the blood, and that many of the obesity problems of Western society may reflect a sugar-rich diet rather than just a high-calorie diet.  Exercise to burn off calories will have little effect in these circumstances (the body burns calories even without exercise) as insulin levels will remain high if blood sugar levels remain high.
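
To see why exercise alone struggles against a sugar-rich diet under the hormone-control view, here is a toy numerical sketch (not a validated physiological model; every constant is invented): glucose drives insulin, insulin drives glucose uptake and fat storage, and stored fat is only released once insulin has fallen back to a low level.

```python
# Toy sketch of the hormone-control idea: glucose raises insulin, insulin
# drives glucose into cells (part of which is stored as fat), and fat is
# only released while insulin is low.  All constants and units are invented.
def simulate(meal_glucose, hours=6.0, dt=0.01):
    glucose, insulin, fat = 5.0 + meal_glucose, 10.0, 0.0
    t = 0.0
    while t < hours:
        uptake = 0.4 * insulin * glucose * dt             # insulin-dependent uptake
        glucose -= uptake
        insulin += (2.0 * glucose - 1.5 * insulin) * dt   # secretion minus clearance
        fat += 0.5 * uptake                               # a fraction of uptake is stored
        if insulin < 5.0:
            fat -= 0.05 * dt                              # low insulin allows fat release
        t += dt
    return glucose, insulin, fat

for meal in (2.0, 10.0):      # small meal versus sugar-rich meal (arbitrary units)
    g, i, f = simulate(meal)
    print(f"meal {meal}: final glucose {g:.2f}, insulin {i:.2f}, net fat stored {f:.2f}")
```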

Update (25/2/2015):
A recent paper by Haghighi et al. 2015 (Polymorphic variant of MnSOD A16V and risk of diabetic retinopathy, Molecular Biology 49, 99-102) shows that oxidation plays a key role in the development of diabetic retinopathy (a side effect of Type 2 Diabetes that can lead to blindness).  This is also an important observation with regard to food, as many coloured foods (especially fruit) are rich in antioxidants and this could help reduce the type of effect described in the paper.

So, what does this mean about the food we eat?  Well, it is much more important to look at sugar levels and carbohydrate levels than just the calorific value of a food.  A good example is breakfast cereals, where many common “healthy” cereals have sugar levels as high as 30% (30 g per 100 g of cereal) and carbohydrate levels around 80%!  However, much lower levels are available with oats (porridge) and Shredded Wheat as examples.  Low-carbohydrate diets that are rich in protein are a good way to control weight by diet – proteins can be converted to glucose in the liver – but such a simple approach has other problems.  We often hear about the “five fruits or vegetables per day”, and that has a great deal of importance in providing a varied diet, but fruits also have a high sugar content!  There are two ways to look at this – one is that fructose in the blood does not induce insulin production; two, fruits (especially highly coloured fruits) contain antioxidants that are very good for the body (preventing damage from the highly reactive species, called radicals, that are produced by sunlight).  But not all is rosy, as high fructose levels may induce insulin resistance, which leads to type 2 diabetes!  However, a recent British Medical Journal article, detailed in The Express newspaper, has indicated that eating fresh fruit such as blueberries can reduce the risk of Type 2 Diabetes by a third.  This fits with my own diet (I eat a mixture of fresh fruit every day) and, as I have said in previous blogs, probably reflects the presence of antioxidants in coloured fruits.  Importantly, fruit juice was found to have a detrimental effect and increased blood sugar and insulin in most cases.  Vegetables are, of course, low in carbohydrates and will not lead to weight gain.  Bread, potato, pasta, rice and sweets are all a supply of sugar, while meats, eggs and fish are high-protein foods.

No one is perfect and sticking rigidly to a low carbohydrate diet can be difficult, but awareness always helps!  Try to pick a mixed and varied diet and always exercise as much as you can – even just vigorous walking is a good idea.

High-level equipment and its impact on science

A recent article (Hamrang, et al. Trends in Biotechnology, 2013) made me think about the impact modern technology is having on how scientific research is developing and, in particular, my own experience of applying some of this technology.  I thought it might be interesting to detail some of this technology and how it has influenced my own research and how it might both develop to provide new approaches for the advancement of science and how this will change requirements in teaching.

A good place to start is SINGLE MOLECULE ANALYSIS, a concept I had never thought of in my early research career, but which became a possibility during the 1990s.  The first time I heard of single molecule analysis was in connection with something called a Scanning Tunnelling Microscope, but I could not see uses for this device outside of chemistry as the objects to be visualised were in a vacuum.  However, this device quickly developed into the Atomic Force Microscope (AFM) and the study of biological molecules was soon underway.  This device measures surface topology and can visualise large proteins as single molecules – my first involvement was to visualise DNA molecules that were being manipulated by a molecular motor.  The resolution was astounding, but more importantly we were able to use this technology to study intermediates that had been biochemically “frozen” in position and resolve features we never expected to see.  Further studies allowed us to also study protein-protein interactions and super-molecular assembly of the motor.  The wonderful thing about this technology is that interpretation of the data has quickly moved from the negativity of “artefacts”, and a lack of faith that images showed what was thought to be there, to a situation where major advancements are possible through direct topology studies.  Developments of this technology are likely to include automatic cell identification, in vivo measurements using fine capillary needles and measurements of ligand-surface target interactions on cells – this could influence drug development and biomedical measurements.  Another developing technology related to AFM is the multiple-tip biosensor that can sense minute amounts of material in a variety of situations (a “molecular sniffer” – one use I heard of directly from the developer was for wine tasting/testing!).
My second single molecule analysis involved a Magnetic Tweezer setup, which is able to visualise movement of a magnetic bead attached to a single molecule (in our case DNA).  This allowed us to determine how a molecular motor moves DNA through the bound complex, but, perhaps more importantly, it led us to develop a biosensor based around this technology that could be used to determine drug-target interactions at the single molecule level and perhaps allow single molecule sensing in anti-cancer drug discovery.  This technology is also closely related to optical tweezer systems that have been used in similar studies, and the future is certain to bring cheaper, easier-to-use versions of such technology and wider application in biomedical research.  The key to this development will be the increased sensitivity of single molecule studies and how this will enable more detailed understanding of intermediate steps in molecular motion induced by biomolecules. I imagine as newer versions of these devices become more automated, they will be used as biosensors to study more complex systems that involve molecular motion.  In the short term, it seems to me that there is scope for the application of these devices in understanding protein amyloid formation and stability, with a view to determining mechanisms for destabilising such structures.
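
As an illustration of the sort of analysis such data needs (my own sketch, using simulated numbers rather than real measurements), the translocation rate of a motor can be estimated from the slope of the tether length against time as the motor pulls DNA through the bound complex.

```python
# Estimate a translocation rate from (simulated) magnetic-tweezer data:
# tether length shortens as the motor pulls DNA through the complex.
# All numbers are simulated for illustration.
import random

def simulated_trace(rate_bp_per_s=500.0, noise_bp=30.0, duration_s=4.0, dt=0.02):
    times, lengths = [], []
    length = 4000.0                     # starting tether length in base pairs
    t = 0.0
    while t <= duration_s:
        times.append(t)
        lengths.append(length + random.gauss(0.0, noise_bp))
        length -= rate_bp_per_s * dt
        t += dt
    return times, lengths

def fitted_slope(xs, ys):
    # simple least-squares slope
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

times, lengths = simulated_trace()
print(f"estimated translocation rate: {-fitted_slope(times, lengths):.0f} bp/s")
```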

SURFACE ATTACHED BIOMOLECULAR ANALYSIS.
The best known system in this category of analytical devices is Biacore’s Surface Plasmon Resonance (SPR), which uses a mass detection mechanism based on changes to the plasmon effect produced by electrons in a thin layer of gold. We have used this to study protein-DNA interactions and subunit assembly, and the technique provides a useful confirmation of older techniques such as electrophoresis. I have been involved in discussions about the application of this technology in the field, but reliability and setup problems remain an issue. In comparison, the Farfield dual beam interferometer can use homemade chips that simplify setup and seems more reliable for similar measurements. Where I see a potential for these devices is in the study of protein aggregation, which has tremendous scope in the study of amyloid-based diseases. This idea sprang from discussions with Farfield about using their interferometer to detect crystallization and would be an interesting project. However, if these devices are to have a major impact in biomedical sciences, they need to be easier to set up, more reliable and smaller.  Recent advances are leading SPR toward single molecule sensing (Punj, D., et al., Nat Nano, 2013). I believe the real key to implementing this technology as a biosensor is to incorporate two technologies in the same device. We proposed to have a dynamic system, on an interferometer chip, whose activity would switch off the interferometer when active. This could be used in drug discovery, targeting the drug at two systems simultaneously. If massively parallel systems can be developed, possibly based around laminar flow, I can see a use in molecular detection of hazardous molecules using either antibodies or aptamers.
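
For completeness, the sensorgrams such instruments produce are usually interpreted with a simple 1:1 binding model, and the sketch below shows that model (this is the textbook analysis rather than our specific experiments; the rate constants, analyte concentration and times are invented).

```python
# Sketch of the textbook 1:1 binding model used to interpret SPR sensorgrams:
# the response rises during the association phase and decays once buffer flow
# starts the dissociation phase.  All constants are invented for illustration.
import math

def sensorgram(kon, koff, conc, rmax, t_assoc, t_total, dt=0.5):
    kobs = kon * conc + koff                 # observed association rate
    r_eq = rmax * kon * conc / kobs          # steady-state response at this concentration
    points, r_end, t = [], 0.0, 0.0
    while t <= t_total:
        if t <= t_assoc:
            r = r_eq * (1.0 - math.exp(-kobs * t))        # association phase
            r_end = r
        else:
            r = r_end * math.exp(-koff * (t - t_assoc))   # dissociation phase
        points.append((t, r))
        t += dt
    return points

curve = sensorgram(kon=1e5, koff=1e-2, conc=1e-7, rmax=100.0, t_assoc=120.0, t_total=300.0)
print(f"peak response: {max(r for _, r in curve):.1f} RU")
```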

COMPUTER BASED IMAGE ANALYSIS.
I have not directly used this technology, but I have seen the results applied to the molecular motor that I have worked with. The value of the system is that cryo-EM allows the gathering of many images of a large protein complex, which allows structural studies of systems that cannot be crystallized or visualized using NMR. My feeling is that as computing power increases this technique, combined with molecular modelling in silico, will provide structural information for many complex biological systems. The impact of this knowledge will greatly influence the design of drugs and will aid the biochemical analysis of complex systems. My feeling is that further development of this technology will revolve around combining it with other techniques for visualising biomolecules; one I have mentioned before is Raman spectroscopy, which could allow studies of these complexes in situ, and another could be single molecule fluorescence (Grohmann, et al. Current Opinion in Chemical Biology). I can easily imagine collaborative research projects that will bring a variety of such techniques to the production of 3D images of real biological systems isolated from cells. Such research would have to follow existing models of bidding to use such equipment in centres of excellence. Such centres would bring together visualization techniques with single molecule analysis and data from genomics and proteomics. The research lab of the future will depend on much more international collaboration than we have seen up to now!

STUDIES USING NANOPORES.
The current technology in this area divides into two types of nanopore: physical holes in a surface and reconstituted biological pores. I have used a physical nanopore to investigate the separation of proteins from DNA using electrophoresis across the nanopore; the beauty of this system is that it also quantifies the number of molecules crossing the pore. I imagine that such devices will develop using surface-attached biomolecules around the pore, which will introduce specificity into the device, but what I would have liked to develop is a dynamic device for ordered assembly of molecules (an artificial ribosome) where the nanopore allows separation of the assembly line and the drive components – such are the dreams of a retired scientist!
Biological nanopores are the main focus for single molecule sequencing of DNA, and the future must be portable, personal sequencing devices (DNA sequencing information must reside with the source of the DNA, and for humans this will eventually lead to personal devices). However, the level of available data will be enormous and the growth of “omics” research will require new ways to store, organise and access this information. A new method for studying biological systems is already underway in which analysis of such data allows a better understanding of complex systems. This will eventually become a part of biomedicine and will be supported by personalised medicine.
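
Counting molecules as they cross a pore is, at its simplest, a matter of detecting transient drops (blockades) in the ionic current.  The sketch below works on a simulated current trace with invented values; real traces need baseline correction and much more careful event detection.

```python
# Sketch of counting translocation events in a nanopore current trace:
# each molecule passing the pore produces a transient drop (blockade) in
# current, and events are counted by thresholding.  The trace is simulated.
import random

def simulated_current(n_events=12, length=5000, baseline=100.0, blockade=40.0, noise=2.0):
    trace = [baseline + random.gauss(0.0, noise) for _ in range(length)]
    for _ in range(n_events):
        start = random.randrange(0, length - 50)
        for i in range(start, start + random.randint(10, 40)):
            trace[i] -= blockade
    return trace

def count_events(trace, threshold):
    # overlapping blockades merge and are counted once
    events, below = 0, False
    for value in trace:
        if value < threshold and not below:
            events += 1          # falling edge: a new blockade begins
            below = True
        elif value >= threshold:
            below = False
    return events

trace = simulated_current()
print("events detected:", count_events(trace, threshold=80.0))
```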

I was once asked by a student what future Biology holds, and I now know it will be an area of significant growth for many years to come, but this requires the right focus for investment and a new direction for undergraduates in their studies – good luck to those I have taught, who now have to lead these developments.