From: Tom Laue <Tom.Laue@unh.edu>
To: rasmb@alpha.bbri.org
Date: Thu, 09 Sep 1999 09:03:27 -0400
Subject: Analysis of associating systems
Dear RASMBers,
Joel's question about the uniqueness of the model used to analyze
sedimentation equilibrium data is an important consideration, especially
when embarking on the study of an associating system. There have been
several really good responses to his question. There are a couple of points
that should be kept in mind.
1- Exponentials are boring functions with few features to distinguish them.
The sums of exponentials are no less boring, and (unless the exponents are
vastly different) look a lot like exponentials. Put another way, the
deconvolution of sedimentation equilibrium data into the component
exponentials is an ill-conditioned (or ill-posed) problem. Unless the
concentration distributions of the components can be monitored
independently, no data transformation will overcome an ill-conditioned
problem. Occasionally one comes across assertions that one
or another method can extract more information in its deconvolution of the
data... this is not possible. The information content of the input data is
the same for all analysis procedures. Therefore, so long as the analysis
procedures attempt to extract the same amount of information (the same
number of parameters), they will run into the same problem of
ill-conditioning. The best solution is to extract the information for
individual components. This is sometimes possible with the absorbance
system (e.g. using 5OHTrp, or deconvolution of data taken at several
wavelengths), and is a major motivation for building fluorescence optics,
where a labeled component may be followed independently. I am missing a key
reference on signal analysis and signal deconvolution. If I find it, I will
pass it along.
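The point about sums of exponentials being featureless can be made concrete
with a short numerical sketch (the exponents and amplitudes below are
arbitrary, chosen only to illustrate the effect):

```python
import numpy as np

# Illustrative sketch: two different exponential models that agree to
# within typical optical noise over a unit data window. The exponents
# and amplitudes are invented for illustration, not taken from data.
x = np.linspace(0.0, 1.0, 200)

# Model A: a single exponential with exponent 2.0.
a = np.exp(2.0 * x)

# Model B: an equal-amplitude sum of exponentials at 1.9 and 2.1.
b = 0.5 * np.exp(1.9 * x) + 0.5 * np.exp(2.1 * x)

rel_diff = np.max(np.abs(a - b)) / np.max(a)
print(f"max relative difference: {rel_diff:.4%}")
```

The two curves differ by well under one percent of the signal, i.e. by less
than typical measurement noise, even though one is a single species and the
other a two-species mixture.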
2- The only way around the problem is to gather as much data as possible,
containing as little noise as possible. Even noise-free data can be
decomposed into only a finite number of exponentials, and small amounts of
noise dramatically decrease the number of parameters that can be extracted.
The
optical systems on the XLI are remarkable for what they can do. They can be
made to do better... however, there is a significant cost/benefit
calculation that has to be made. Some groups (e.g. David Yphantis, Walter
Stafford, ours) are pursuing such hardware improvements independently. It
is not clear how to make such improvements available to the wider
community. Suggestions are welcome.
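The way noise limits the number of recoverable parameters can be quantified
by the condition number of the least-squares design matrix; a sketch with
arbitrarily chosen, closely spaced exponents:

```python
import numpy as np

# Sketch: design matrix for a linear fit of three exponential amplitudes
# with similar exponents (values invented for illustration). The condition
# number sets how strongly data noise is amplified into parameter
# uncertainty; near-collinear columns make it large.
x = np.linspace(0.0, 1.0, 100)
A = np.column_stack([np.exp(s * x) for s in (1.8, 2.0, 2.2)])
cond = np.linalg.cond(A)
print(f"condition number: {cond:.1e}")
```

A condition number of this size means that even fractional-percent noise in
the data translates into order-of-magnitude uncertainty in the amplitudes,
which is the ill-conditioning described above.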
3- The accuracy of the analysis (i.e. for synthetic data, whether you
recover the input parameters) depends strongly on how well the data span
the range where the parameter makes a significant contribution to the
signal. For an association constant, the data must span the concentration
range where there are significant amounts of both dissociated and
associated species (preferably reaching >=90% of each at the extremes). A
single experiment consisting of 3
concentrations at 1:1, 1:3 and 1:9 dilutions, and at 3 rotor speeds
covering at least a 2.5-fold range in speeds, will allow the examination of
about a 2-log range in concentrations. For systems with two association
steps, almost always one of them will fall outside such a range. Thus,
Jack's comment about gathering more data is correct. The point above is
that there is no way to analyze your way into certainty- a well-designed
experiment is the correct way to get the necessary information.
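For the simple monomer-dimer case, the required concentration window can be
sketched numerically (a hypothetical association constant of 1 in matching
weight-concentration units, chosen so the midpoint falls at c = 1):

```python
import numpy as np

# Sketch for a monomer-dimer equilibrium, c_total = m + 2*Ka*m**2
# (weight concentrations; Ka = 1 in matching units, purely illustrative).
def dimer_fraction(c_total, ka=1.0):
    # Solve the quadratic 2*ka*m**2 + m - c_total = 0 for monomer m.
    m = (-1.0 + np.sqrt(1.0 + 8.0 * ka * c_total)) / (4.0 * ka)
    return (c_total - m) / c_total

# Going from ~10% to ~90% dimer requires nearly a 3-log concentration span.
for c in (0.06, 1.0, 45.0):
    print(f"c = {c:6.2f}  fraction dimer = {dimer_fraction(c):.2f}")
```

Covering 10% to 90% dimer takes close to three logs of concentration, so a
single 2-log experiment can easily leave one end of the transition
unsampled; hence the emphasis on experimental design.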
4- Discriminating between 1<->3 and 1<->2<->4 associations is the most
challenging analysis. Emory Braswell can address this best. The best case
is to drive the equilibrium as far to the right as possible (again, Jack's
suggestion). This runs into problems if the material is scarce, insoluble,
etc. Furthermore, for a weaker association, the assumption that nonideality
is insignificant may not hold. We have dealt with a number of systems where
there is a very weak association and sufficient nonideality to mask the
true extent of association. Unless the monomer mass and all lower
association constants are fixed, we have found it extremely difficult and
frustrating to analyze an association in the face of significant
nonideality. The work of Don Winzor, Mike Jacobsen, Peter Wills, Allen
Minton and others show that it can be done, but it ain't easy or pretty.
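The masking effect can be illustrated with a crude sketch using the
first-order nonideality correction 1/Mapp = 1/Mw + 2Bc. Every parameter
value below (monomer mass, association constant, virial coefficient) is
invented for illustration:

```python
import numpy as np

# Sketch: a weak monomer-dimer association whose rise in weight-average
# mass is largely canceled by thermodynamic nonideality.
# All parameter values are invented for illustration, not measured.
M1 = 50000.0   # monomer mass, g/mol
ka = 0.05      # association constant, L/g (weight-concentration scale)
B = 5e-7       # second virial coefficient, mol*L/g^2

c = np.linspace(0.5, 5.0, 10)                          # total conc., g/L
m = (-1.0 + np.sqrt(1.0 + 8.0 * ka * c)) / (4.0 * ka)  # monomer, g/L
Mw = M1 * (2.0 - m / c)                # ideal weight-average molar mass
Mapp = 1.0 / (1.0 / Mw + 2.0 * B * c)  # apparent mass with nonideality

print(f"ideal Mw at top of range:  {Mw[-1]:.0f}")
print(f"apparent mass at top:      {Mapp[-1]:.0f}")
```

In this sketch the true weight-average mass rises more than 25% above the
monomer mass, yet the apparent mass stays within a few percent of the
monomer value across the whole range: the association is effectively
hidden.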
5- For multiple associations, it is almost always necessary to fix sigma at
the known monomer molecular weight. With clean interference data, sigma can
be floated to see if the returned value is reasonable. I have had poor luck
trying this with absorbance data. Given mass spec data, fixing sigma is
reasonable. Sequence data is OK, if it can be shown that there are no
post-translational modifications. Don't feel guilty about fixing sigma- how
many binding curves are generated in which the experiment is asked to
provide both the association constant and the molar mass of the ligand? If
you are concerned about the uncertainty in sigma, fit the data using a
range of sigmas spanning the uncertainty (e.g. 3% for the vbar of an
unconjugated protein, 10% larger for small charged peptides) and get an
estimate of the uncertainty in the association constants this way.
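Because sigma is proportional to the buoyancy term (1 - vbar*rho), a small
vbar uncertainty is amplified; a quick arithmetic sketch with typical
assumed values (vbar = 0.73 mL/g, rho = 1.00 g/mL):

```python
# Sketch: how a 3% uncertainty in vbar propagates into sigma, which is
# proportional to (1 - vbar*rho). The values are typical assumptions,
# not measurements.
vbar, rho = 0.73, 1.00
buoyancy = 1.0 - vbar * rho

for dv in (-0.03, +0.03):  # +/- 3% error in vbar
    b = 1.0 - vbar * (1.0 + dv) * rho
    print(f"vbar error {dv:+.0%} -> sigma error {(b - buoyancy) / buoyancy:+.1%}")
```

A 3% vbar error becomes roughly an 8% error in sigma for a typical protein,
which is why refitting over a range of sigmas is a sensible way to bracket
the uncertainty in the association constants.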
6- The stoichiometries may not be exactly integral. An assumption in these
analyses is that vbar is constant with association. This may not always be
true- especially if one of the components is a nucleic acid. Changes in
vbar on association will be reflected in the stoichiometry.
7- For heteroassociations (e.g. A+B), the mole ratio of the components
should be varied in addition to the concentration and rotor speed. For the
correct model, the association constants returned should be the same for
the different mole ratios. Furthermore, the thermodynamic characterization
of the individual components must be made first, and sigma and any
self-associations of the components must be held fixed in the analysis of
the heteroassociation.
In short, the proper analysis of these systems is tricky, and requires the
simultaneous analysis of large quantities of high-quality data. Analysis of
single data sets is dangerous. A version of NONLIN for hetero-associating
systems is available for VMS-Alpha, and is in the works for the PC.
8- Use other information and other methods to test or help analyze your
results. It is always dangerous to use any single method to characterize an
associating system. The ill-conditioning problem is one of nature's
cruelest jokes- it crops up in many techniques. To gain confidence in your
analysis, ask if your data is reasonable with respect to what is known
about a system. If a protein binds to a dyad symmetric site, it is more
likely to be in a dimer-tetramer than a monomer-trimer equilibrium (though
the latter is possible). We have found that kinetically determined Km's or
Ki's can be useful confidence builders. In the end, there is no substitute
for corroborating data.
The difficulty in analysis of equilibrium sedimentation data is real and
stems from the nature of the signal. When you are confident of the
analysis, though, you have a correct thermodynamic description of your
system... no caveats are needed and any other experimental result should be
considered in light of that description.
Best wishes,
Tom
--------------------------------------------
Tom Laue
Professor and Director of the Center to Advance Molecular Interaction Science
University of New Hampshire
Biochemistry and Molecular Biology
Rudman-379
46 College Rd.
Durham, NH 03824-3544
Phone: 603-862-2459
FAX: 603-862-4013
---------------------------------------------