CONFOCALMICROSCOPY Archives

July 2008

CONFOCALMICROSCOPY@LISTS.UMN.EDU

Subject:   Re: Zeiss 5-LIVE and Timing Issues
From:      Craig Brideau <[log in to unmask]>
Reply To:  Confocal Microscopy List <[log in to unmask]>
Date:      Fri, 11 Jul 2008 14:56:05 -0600
Search the CONFOCAL archive at
http://listserv.acsu.buffalo.edu/cgi-bin/wa?S1=confocal

Try doing the same thing with a different microscope but similar image
parameters: same bit depth per pixel, number of pixels, and acquisition
times.  I'll bet you see similar performance issues.  We have some homebrew
microscopes where we ran into operating-system timing problems.  I actually
had to offload a lot of the timing, like the pixel clock generation, to
external hardware (pre-programmed timer/counter modules with their own
hardware clocks), because Windows cannot generate a reliable timing signal
in the millisecond range given the operating system's event-handling
limitations.  Again, you need either dedicated external hardware or a
real-time OS to get reliable control over short timing intervals, and
Windows (like most desktop OSes) simply doesn't provide this.
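
If you want to convince yourself of this on whatever PC is driving the
acquisition, here is a minimal sketch (plain Python, nothing specific to any
microscope or vendor software) that simply asks the OS for short fixed
delays and logs how late each wake-up actually is.  On a stock desktop
install the reported errors for the shortest requests are often several
milliseconds, which is exactly why we pushed the pixel clock out to
dedicated timer/counter hardware:

import statistics
import time

def sleep_interval_errors(requested_ms, repeats=200):
    """Request a fixed delay from the OS; return how late each wake-up was, in ms."""
    errors = []
    for _ in range(repeats):
        t0 = time.perf_counter()
        time.sleep(requested_ms / 1000.0)   # the OS decides when we actually wake up
        errors.append((time.perf_counter() - t0) * 1000.0 - requested_ms)
    return errors

for requested in (100, 50, 10, 1):
    errs = sleep_interval_errors(requested)
    print(f"{requested:>3} ms requested: mean error {statistics.mean(errs):+6.2f} ms, "
          f"worst {max(errs):+6.2f} ms")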

Craig


On Fri, Jul 11, 2008 at 2:50 PM, Lambright, Geoffrey <
[log in to unmask]> wrote:

>
> Hi Craig, and list,
>
>
>
> So if it is the computer's "shuffling" of the data that is causing the
> imprecision in the timing, does that mean that the timing of the actual
> image acquisition is precise, and that the variance Holly sees comes from
> the computer's ability to process and record the captured data?  Any idea
> how one could check for that, if that were the case?
>
>
>
> Geoff
>
>
>
> From: Confocal Microscopy List [mailto:[log in to unmask]] On Behalf Of Craig Brideau
> Sent: Friday, July 11, 2008 1:03 PM
> To: [log in to unmask]
> Subject: Re: Zeiss 5-LIVE and Timing Issues
>
>
>
> Hi Holly!  This may be dependent on the computer hardware running the
> 'scope.  If it's Windows, then all bets are off for timing once you get
> down to the low hundreds or tens of ms.  Basically the OS can't reliably
> shuffle data around fast enough at those speeds; you need a real-time
> operating system for that.  In most multitasking operating systems no
> single process can 'bogart' the CPU, and the time it takes to switch
> between process threads can vary.  What this boils down to is that the
> computer will take its own sweet time handling the data coming in from your
> microscope, depending on what else is going on with the machine at the time
> scales we are talking about.
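>
> One crude way to see this effect for yourself (a sketch only, and nothing
> to do with the 5-LIVE's actual acquisition path): time a fixed 50 ms delay
> over and over while the machine is idle, then again while a few CPU-hungry
> processes are competing for the scheduler, and compare the spread of the
> measured intervals.
>
> import multiprocessing
> import time
>
> def burn_cpu():
>     while True:   # keep one core busy so the scheduler has competition
>         pass
>
> def interval_spread(requested_s=0.05, repeats=100):
>     """Measure a requested delay repeatedly; return (min, max) of what we got."""
>     measured = []
>     for _ in range(repeats):
>         t0 = time.perf_counter()
>         time.sleep(requested_s)
>         measured.append(time.perf_counter() - t0)
>     return min(measured), max(measured)
>
> if __name__ == "__main__":
>     print("idle:   min/max = %.4f / %.4f s" % interval_spread())
>     hogs = [multiprocessing.Process(target=burn_cpu, daemon=True)
>             for _ in range(multiprocessing.cpu_count())]
>     for p in hogs:
>         p.start()
>     print("loaded: min/max = %.4f / %.4f s" % interval_spread())
>     for p in hogs:
>         p.terminate()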
>
> Craig
>
>
>  On Fri, Jul 11, 2008 at 12:14 PM, Holly Aaron <[log in to unmask]>
> wrote:
>
>
> Dear Confocal Community –
>
>
>
> This question may be very specific to the Zeiss community and even more
> specific to the 5-LIVE users out there.
>
> We find the 5-LIVE to be unreliable/unpredictable/unrepeatable at time
> intervals shorter than 100 ms. By this I mean that if we set up a
> time-series in which an image should be taken every 500 ms (let's say the
> time required for the image is 30 ms), it works well: an image is in fact
> taken every 500 ms. However, if we then ask for a shorter interval, 100 ms
> or less (which should be fine given only 30 ms per frame), the images are
> taken at irregular times, with some intervals longer than requested and
> some shorter:
>
>
>
> Image #   Time Expected [s]   Time Actual [s]
> 1         0                   0
> 2         0.1                 0.0987
> 3         0.2                 0.1998
> 4         0.3                 0.3009
> 5         0.4                 0.4008
>
>
>
> This gets worse when we go to shorter and shorter intervals, for example,
> 50ms:
>
> Image #   Time Expected [s]   Time Actual [s]
> 1         0                   0
> 2         0.05                0.0209
> 3         0.1                 0.1064
> 4         0.15                0.1273
> 5         0.2                 0.2104
>
>
>
> And even worse for shorter…
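>
> To put numbers on it, here is a small sketch using just the "Time Actual"
> values from the 50 ms series above: it prints the frame-to-frame intervals
> and how far each one is from the requested 50 ms.
>
> requested = 0.050
> times = [0.0, 0.0209, 0.1064, 0.1273, 0.2104]   # logged frame times, seconds
>
> intervals = [t1 - t0 for t0, t1 in zip(times, times[1:])]
> for frame, dt in enumerate(intervals, start=2):
>     print(f"frame {frame}: interval {dt*1000:6.1f} ms, "
>           f"error {(dt - requested)*1000:+6.1f} ms")
>
> The intervals come out at roughly 21, 86, 21 and 83 ms, i.e. errors of up
> to about 35 ms on a 50 ms request, whereas the 100 ms series is off by only
> about a millisecond per frame.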
>
>
>
> I am wondering: Does anyone else see this phenomenon? Have you been able to
> get around it? We are doing electrophysiology in concert with imaging, and
> timing is crucial. So far we have not been able to set it up to trigger
> each image, because that takes too long and is even less predictable. We
> would be very grateful to anyone who has found a work-around for this
> problem.
>
>
>
> Thank you!
>
> __________________
>
> Holly L. Aaron
>
> Molecular Imaging Center
>
> Cancer Research Laboratory
>
> University of California Berkeley
>
> 447 LSA #2751
>
> Berkeley, CA  94720-2751
>
> 510.642.2901
>
> 510.642.5741 fax
>
> [log in to unmask]
>
> http://imaging.berkeley.edu
>

