David's Astronomy Pages
Notes - Session 793 (2020-07-06)
Main aims
- AstroMain 3.25.7. Test out fixes to bugs seen in the last session (S792, 3.25.6) and test for any repeat of the glitches seen in that session (S792, 2020-07-03).
- Targets. Acquire a variety of targets if conditions allowed, including M85 with the new supernova reported in the recent (end of June) BGS newsletter.
Equipment & Software
- 12" LX200 and ST-10XME
- 80mm APO and ZWO ASI 178MC
- CCDApp2, TheSky6, CCDSoft5, PHD2, DeviceHub 4.6.1.10, UPB 1.5
- Pegasus UPB & Motor Focus, TCF-S
- SharpCap 3.2, PHD2 2.6.8
- AstroMain 3.25.7, AstroGuard 1.4.2, AstroPlan 1.15.3, AstroShCap 1.2.22, AstroVOE 1.3.6
  (AstroWeather 1.10.8, AstroAllSky 3.6)
Highlights
- None
Lowlights
- Dome Driver Issue. Session not completed or shut down. Job Queue froze whilst closing the dome at 02:04; Dome Properties were frozen and not updating.
- Supernova in M85. Unable to acquire images of the supernova in M85 due to cloud.
Summary Plots & Logs
- Observing Plan
- Observing Result (Southern targets added as ToO targets)
- Dome & Scope Slewing Performance
- Slew/Centering Performance
- Guiding Performance
- Sky Conditions (Locate Frames)
- Night Sky Summary Plot (top axis: Sky Brightness at Zenith in ADU/s; lefthand axis: Local Time (hh LT); righthand axis: Sun Altitude in degs)
Pre-Session Weather Forecast (API data from darksky.net)
The pre-session weather forecast showed variable skies ranging from 5 to 50% cloud cover, light winds (wind 7 mph) and a minimum temperature of 6°C.
In the event, conditions were clear and calm through the whole night, with a minimum temperature of around 4°C.
Session Event Log

Time      Event                       Detail
23:39:28  Session Created             Live Session Created (2020-07-06 S00793, ImageSaveNum: 793001)
23:39:49  Obs.Manager Started         Obs.Manager started
23:39:51  Obs.Overseer Started        Obs.Overseer started
23:39:55  Services Started            Observatory Services started
23:40:08  Scope Switched On           Telescope Power has been switched on via UPB Powerbox.
23:41:56  Services Started            Observatory Services started
23:44:05  Telescope Connected         Telescope Connected (TheSky6)
23:44:24  Camera1 Connected           SBIG Camera Connected (set point -15°C)
23:45:49  Observatory (Auto)          Observatory placed in Fully-Automated Mode
23:45:52  Session Pending             Session pending (2020-07-06)
23:45:54  Session Initiating          Session initiating (2020-07-06)
23:45:56  Plan Requested              Observing Plan requested from AstroPlan (1.15.3)
23:46:16  Plan Loaded                 Observing Plan loaded to queue (2020-07-06 V1 BEST PLAN, Plan ID: 522)
23:46:20  Camera1 Connected           SBIG Camera Connected (set point -15°C)
23:46:24  Telescope Connected         Telescope Connected (TheSky6)
23:46:37  Session Equilibration       Session ready for dome & camera equilibration
23:49:57  Dome Opened                 Dome opened (opening time 44s)
23:50:00  Equilibration Started       Dome Equilibration started
00:03:13  Equilibration Ended         Dome Equilibration ended (13.3 mins total)
00:03:15  Session Running             Session running
00:03:17  Queue Started               Observing Queue started (12 targets selected)
00:04:17  Focusing Started-Foc1       Foc1 Focusing Started (TCF-S)
00:06:56  Focusing Completed          Foc1 AutoFocus Completed (Profile No 1)
00:06:58  Target Started (1/12)       Target started (1/12, HIP 87937)
00:09:06  Focusing Started-Foc1       Foc1 Focusing Started (TCF-S)
00:11:55  Focusing Completed          Foc1 AutoFocus Completed (Profile No 2)
00:14:24  Target Completed            Target completed (1/12, HIP 87937)
00:14:26  Target Started (2/12)       Target started (2/12, GCVS CY Lyr)
00:16:22  Focusing Started-Foc1       Foc1 Focusing Started (TCF-S)
00:19:14  Focusing Completed          Foc1 AutoFocus Completed (Profile No 3)
00:22:47  Target Completed            Target completed (2/12, GCVS CY Lyr)
00:22:49  Target Started (3/12)       Target started (3/12, M85)
00:28:31  Focusing Failed             Foc1 Focusing Failed - unable to find a star (TCF-S)
00:29:46  Target Aborted (3/12)       Target aborted (3/12, M85) due to GRB Alert
00:29:48  Target Started (4/13)       Target started (4/13, C/2017 T2 (PanStarrs), ToO)
00:41:10  Target Aborted (4/13)       Target aborted (4/13, C/2017 T2 (PanStarrs), ToO) due to GRB Alert
00:41:12  Target Started (5/13)       Target started (5/13, GCVS R CrB)
00:43:03  Focusing Started-Foc1       Foc1 Focusing Started (TCF-S)
00:45:57  Focusing Completed          Foc1 AutoFocus Completed (Profile No 4)
00:48:48  Target Completed            Target completed (5/13, GCVS R CrB)
00:48:51  Target Started (6/13)       Target started (6/13, GCVS AM Her)
00:54:53  Focusing Started-Foc1       Foc1 Focusing Started (TCF-S)
00:57:54  Focusing Completed          Foc1 AutoFocus Completed (Profile No 5)
01:02:48  Target Completed            Target completed (6/13, GCVS AM Her)
01:02:50  Target Started (7/13)       Target started (7/13, GCVS CI Cyg)
01:12:54  Target Completed            Target partially completed (7/13, GCVS CI Cyg)
01:12:57  Target Started (8/13)       Target started (8/13, GCVS SS Cyg)
01:25:21  Target Completed            Target completed (8/13, GCVS SS Cyg)
01:25:23  Target Started (9/13)       Target started (9/13, GCVS BL Lac)
01:38:19  Target Completed            Target partially completed (9/13, GCVS BL Lac)
01:38:21  Target Started (10/13)      Target started (10/13, GCVS S Cyg)
01:52:56  Target Completed            Target partially completed (10/13, GCVS S Cyg)
01:52:58  Target Started (11/13)      Target started (11/13, Berkeley 90)
02:03:25  SoftSuspend Called          Soft Suspend is called due to Deteriorating Conditions (reducing clarity)
02:03:40  SoftSuspend Cancelled       Soft Suspend is cancelled due to Improved Conditions
02:03:45  SoftSuspend Called          Soft Suspend is called due to Deteriorating Conditions (reducing clarity)
02:04:40  Critical Cloud Alert        Critical Cloud Alert (Obs.Manager will close the Shutter)
02:04:42  HardSuspend Called          Hard Suspend is called due to Critical Conditions (cloud)
02:04:45  Target Aborted (11/13)      Target aborted (11/13, Berkeley 90) due to cloud
02:04:47  Queue Paused                Queue paused for SoftSuspend
02:04:49  Session Suspended           Session suspended
02:04:51  Queue Paused                Queue pausing for HardSuspend
02:16:00  Overseer Intervention       Obs.Overseer Intervention to make Observatory Safe
02:44:00  User Intervention           User Intervention to abort Job Queue and switch observatory to manual mode
02:44:02  Queue Aborted               Job Queue aborted
02:45:02  Telescope Parked            Telescope parked (parking time 43s)
02:45:33  Telescope Switched Off      Telescope Power has been switched off via UPB Powerbox.
02:51:46  Services Stopped            Observatory Services stopped
02:57:21  Program Closed              Program closed by User

Session Alerts

Time      Alert                       Detail
--        No Alerts                   --
Critical Issues
Major Issues
- Session not completed and normal automated observatory shutdown didn't execute (this followed an issue with the Dome Driver at 02:04 and the freezing of the Job Queue whilst attempting to close the dome shutter). The session was scheduled to finish at 02:17; upon checking the status of the session via the website at 02:30, the session appeared to have become frozen whilst waiting for the shutter to close for session suspension at 02:04. Safeguards operated to Close Shutter, Park Dome and Suspend Telescope Tracking (so the Observatory was completely safe), however the session operations were spoiled. An investigation was carried out which provides some understanding of the problem, but doesn't provide a definitive single cause or clear solution. See Investigation - Session not completed or shutdown (Job Queue froze whilst closing dome at 02:04).
The issue was eventually fixed on 2020-10-15 by the installation of Device Hub 6.5.0.1, which has better exception trapping. The issue was deemed closed on 2020-11-06 after there were no similar incidents during the first 8 sessions using the new DH version.
Closed 2020-11-06 (Device Hub 6.5.0.1).
Minor Issues
- The Y scale (3 to -7 degC) on the Focus Temperature Chart is inappropriate given the starting OTA temperature (9.2 degC). It seems that a temperature value of 0 degC was picked up from somewhere and used for defining the Y scale and for drawing the initial curve line (outside temp 0.0 to 8.4 degC) and the initial UPB Env Temp (vcNull to +9.7 degC). It seems the 0.0 degC caused the Initial Focus Position to be wrongly positioned at ~3300 rather than ~2500. Luckily the focus position had been corrected to the actual higher tube temperature of 9.1 degC by the time that Focus Profile 1 was acquired. The first line of data in the session's monitor file shows a Foc (tube) Temp of 0.0 at 23:39:39. A fix was attempted by adding an extra condition check so that StartChart is skipped if OtaTemp = 0.0 (it won't work if OtaTemp = 0.01); a sketch of the check is given below.
Fix attempted 2020-07-07 (AstroMain 3.25.8)
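A minimal VB.NET-style sketch of the kind of guard check attempted, assuming illustrative names (GetOtaTemperature and StartChart are placeholders, not the actual AstroMain routines):

' Sketch only: skip chart initialisation when the tube temperature still
' reads an invalid 0.0 degC at session start.
Private Sub StartFocusTemperatureChartIfValid()
    Dim otaTemp As Double = GetOtaTemperature()   ' hypothetical helper returning the tube temperature
    If otaTemp = 0.0 Then
        ' Reading not yet valid - don't let a spurious 0.0 degC define the
        ' Y scale or the initial focus position.
        ' (As noted above, a reading of 0.01 degC would still slip through.)
        Return
    End If
    StartChart(otaTemp)   ' hypothetical call that sets the Y scale from the real tube temperature
End Sub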
- A ToO target was inserted into the queue for immediate execution during execution of Target 3, but was unexpectedly reported as 'deferred' due to a current ToO. M85 (a must-include target) was running, but due to cloud along the western horizon a new target (C/2017 T2 PanStarrs) with a better chance of clear sky was inserted into the job queue as a ToO for immediate execution; however it was reported as deferred due to a current ToO, even though M85 was not a ToO target (though it was a 'must-include' target). To get M85 to stop and the comet target to execute immediately in its place, a user intervention was made using the 'Abort Target' button. This workaround worked: the M85 target was stopped and the job manager began executing the T2 comet target. However this intervention shouldn't have been necessary - it's unclear why the comet ToO was 'deferred' rather than executed immediately.
The issue was eventually investigated 2020-10-04 (see Investigation - ToO Insertion Issues (S793, S794, S804)) and found to be due to a bug (If TargetOpportunity = True) in the 'HandleToO' routine that was causing unexpected 'Target ... is deferred due to a current ToO' messages when trying to insert a Priority 1 (Do Now) Target of Opportunity (a sketch of the suspected logic is given below).
Done 2020-10-05 (AstroMain 3.30)
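One plausible reading of the bug description is that the routine was testing the ToO flag of the incoming target rather than of the target currently running. A hedged VB.NET-style sketch, with illustrative names (ObsTarget, ReportDeferred, InsertForImmediateExecution are placeholders, not actual AstroMain code):

Private Sub HandleIncomingToO(newTarget As ObsTarget, currentTarget As ObsTarget)
    ' Buggy form (one reading of the bug): testing the incoming target's own
    ' flag means every inserted ToO gets "deferred due to a current ToO".
    ' If newTarget.TargetOpportunity = True Then ReportDeferred(newTarget)

    ' Intended behaviour: only defer when the target already running is itself a ToO.
    If currentTarget IsNot Nothing AndAlso currentTarget.TargetOpportunity Then
        ReportDeferred(newTarget)
    Else
        InsertForImmediateExecution(newTarget)   ' Priority 1 (Do Now) ToO runs at once
    End If
End Sub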
- Telescope slews are recorded as taking a much longer time than the actual telescope slew. After the problem with the Dome Driver at 02:04, when Dome Properties became frozen, telescope slews for 6 targets (AM Her to Berkeley 90) were recorded as taking 205 to 240s, when their actual slew times, based on the last scope movement, were much less (16-40s). The slews timed out after 240s (4 minutes). A sketch of timing the slew from the last detected movement is given below.
Fix attempted 2020-07-07 (AstroMain 3.25.8)
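A hedged VB.NET-style sketch of recording the slew duration from the last detected scope movement rather than from the end of the 240s wait (SlewFinished is an illustrative placeholder; objScope follows the identifier used elsewhere in these notes):

Private Function TimeScopeSlew() As Double
    Dim slewStart As DateTime = DateTime.Now
    Dim lastMovement As DateTime = slewStart
    Dim lastAz As Double = objScope.Azimuth

    ' Poll until the slew is declared finished or the 240s timeout is reached
    Do While Not SlewFinished() AndAlso (DateTime.Now - slewStart).TotalSeconds < 240
        System.Threading.Thread.Sleep(1000)
        If Math.Abs(objScope.Azimuth - lastAz) > 0.01 Then
            lastMovement = DateTime.Now      ' scope is still moving
            lastAz = objScope.Azimuth
        End If
    Loop

    ' Report the time to the last movement (16-40s in this session)
    ' rather than the 205-240s spent waiting on the frozen dome.
    Return (lastMovement - slewStart).TotalSeconds
End Function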
- Dome slews should have been assumed completed after 10s rather than left to time out. After the problem with the Dome Driver at 02:04, when Dome Properties became frozen, dome slews for 6 targets (AM Her to Berkeley 90) were recorded as taking an unknown (vcNull, -99s) time after timing out at 240s (4 minutes). Code in AstroMain 3.25.7 should have set bDomeSlewCompleted to true when slewing hadn't been detected to have started after 10s; however, due to an issue with the Dome Driver, objDome.IsSlewing was continuously reported as True, regardless of the Dome's actual slewing state (which was stationary). The code could be modified to detect that the Dome hadn't moved in either 60s or the time the scope took to finish its slew (whichever is longer) and set bDomeSlewCompleted to true, enabling the overall slew to finish well ahead of the otherwise inevitable timeout after 240s (a sketch is given below).
Fix attempted 2020-07-07 (AstroMain 3.25.8)
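A VB.NET-style sketch of the suggested stall check, assuming illustrative module-level variables (objDome.IsSlewing and bDomeSlewCompleted follow the identifiers quoted above; the rest are placeholders):

Private lastDomeAz As Double
Private lastDomeMovement As DateTime

Private Sub CheckDomeSlewStalled(scopeSlewSeconds As Double)
    ' Treat the dome slew as complete if the dome azimuth hasn't changed for 60s,
    ' or for the duration of the scope slew if that was longer, instead of trusting
    ' objDome.IsSlewing (left stuck at True by the faulty driver).
    Dim stallLimit As Double = Math.Max(60.0, scopeSlewSeconds)

    If Math.Abs(objDome.Azimuth - lastDomeAz) < 0.1 Then
        If (DateTime.Now - lastDomeMovement).TotalSeconds > stallLimit Then
            bDomeSlewCompleted = True    ' finish well ahead of the 240s timeout
        End If
    Else
        lastDomeAz = objDome.Azimuth
        lastDomeMovement = DateTime.Now
    End If
End Sub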
- Certain wind direction indicators on the 5d Wind Speed graphs are still being formed from overposted arrows. Although overposting on the first directional arrow was fixed in AstroWeather 1.10.8, the first wind direction indicator will still sometimes show the same problem.
Fixed 2020-07-07 (AstroWeather 1.10.9)
Continuous Improvement
- The Y scale on the CCD Temperature chart should probably have been -5 to -30 degC given the initial CCD Temp of -15 degC. The chart came with a -10 to -35 degC range. A -5 to -30 degC range provides more working room should the CCD Temp later move to -10 degC, whilst still providing room in case the CCD Temp moves to -20 degC instead (a small sketch of the range calculation is given below).
Fixed 2020-07-07 (AstroMain 3.25.8)
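For illustration only, the intended Y range can be derived directly from the CCD set point (a sketch, not the actual AstroMain code):

Dim ccdSetPoint As Double = -15.0          ' set point used this session
Dim yMax As Double = ccdSetPoint + 10.0    ' -5 degC: headroom if the CCD later runs at -10 degC
Dim yMin As Double = ccdSetPoint - 15.0    ' -30 degC: headroom if it is later run at -20 degC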
Issue : Session became frozen at 02:04 and automated observatory shutdown didn't occur.
The session was scheduled to finish at 02:17. Upon checking the status of the session via the website at 02:30, the session appeared to have become frozen whilst waiting for the shutter to close for session suspension at 02:04.
Details : Following a hard suspend at 02:04, the Obs.Manager operated to suspend the Session and ordered a closure of the Dome's Shutter. Dome closure didn't appear to be completed.
At 02:15 the Obs.Overseer recognised that the Job Queue appeared to be frozen, still paused from the HardSuspend at 02:04, and ran the MakeObservatorySafe routine. The Obs.Overseer ran the MakeObservatorySafe routine again at 02:27 and 02:38. In each case the Slew Dome to Park Position (Az 90) was aborted after 3 minutes, with the Dome still at Az 239.8 deg.
After acquiring screen captures of the Obs.Computer display, a user intervention was made at 02:44 to abort the Job Queue & set the Observatory Mode to Manual. The 'ParkTelescope' routine was then run to park the telescope / switch off the telescope.
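A minimal sketch of the kind of IamAlive heartbeat check the Obs.Overseer appears to apply; the 10-minute threshold and the names lastIamAliveTime, LogAlert and MakeObservatorySafe's internals are illustrative assumptions, not confirmed AstroMain/Obs.Overseer code:

Private Sub CheckJobQueueHeartbeat(lastIamAliveTime As DateTime)
    ' If no IamAlive message has been received from the Job Queue for some
    ' minutes, assume it is frozen and make the observatory safe.
    If (DateTime.Now - lastIamAliveTime).TotalMinutes > 10 Then
        LogAlert("Job Queue appears to be frozen.")   ' e.g. frozen since 02:04, detected at 02:15
        MakeObservatorySafe()   ' close dome, turn off tracking & slaving, slew dome to park (Az 90)
    End If
End Sub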
The problem was investigated by looking through data from the session:
AstroMain Report Logs
Taking 30s image (C)... **
> Observatory Manager Alert Sky conditions have deteriorated (Clarity 26.9, Stars 0)
> Observatory Manager Info Soft Suspend is called (02:03:45)
** Ok [00793094] 01:03:44 (UT) Full 3x3 30s C
Taking 30s image (C)... **
> Observatory Alert Alert 2020-07-07 02:04 (Local)
> Critical Cloud Conditions Alert Observatory Manager will close the Dome Shutter
> Reason To Suspend Info Cloud (State.11) Clarity: 23.9
> Observatory Manager Alert Hard Suspend is called (02:04:40)
> Observatory Manager Info Aborting Current Target
** Failed ()
> Target Aborting ... Ok Target aborting due to cloud
Pausing Queue... Ok Queue paused for SoftSuspend
Session Suspended 2020-07-07 02:04 (Local)
Session: 2020-07-06 Info Session suspended
Pausing Queue... Ok Queue paused for HardSuspend
Close Dome... Closing
> Obs.Overseer Alert Job Queue appears to be frozen. (02:15)
> Obs.Overseer Alert Last IamAlive message was from ExecuteQueue/Paused_ForHardSuspend/ at 02:04
Making Observatory Safe 2020-07-07 02:15 (Local)
Close Dome
Turn Off Tracking.. Ok Telescope Tracking is OFF
Turn off Dome Slaving... Ok Dome Slaving is OFF
Slew Dome to Az 90 Info Slewing dome to 90 deg (Park Position)
Slew Dome to 90 Az Abort Dome slew aborted after 180s (Dome at Az 239.8 degs)
Dome Speed: Info 0.0 deg/s (0.0 degs in 180s)
> Obs.Overseer Alert Job Queue appears to be frozen. (02:27)
> Obs.Overseer Alert Last IamAlive message was from SetTrackingOff/Continue/ at 02:16
Making Observatory Safe 2020-07-07 02:27 (Local)
Close Dome
Turn Off Tracking.. Ok Telescope Tracking is OFF
Slew Dome to Az 90 Info Slewing dome to 90 deg (Park Position)
Slew Dome to 90 Az Abort Dome slew aborted after 180s (Dome at Az 239.8 degs)
Dome Speed: Info 0.0 deg/s (0.0 degs in 180s)
> Obs.Overseer Alert Job Queue appears to be frozen. (02:38)
> Obs.Overseer Alert Last IamAlive message was from SetTrackingOff/Continue/ at 02:28
Making Observatory Safe 2020-07-07 02:38 (Local)
Close Dome
Turn Off Tracking.. Ok Telescope Tracking is OFF
Slew Dome to Az 90 Info Slewing dome to 90 deg (Park Position)
Slew Dome to 90 Az Abort Dome slew aborted after 180s (Dome at Az 239.8 degs)
Dome Speed: Info 0.0 deg/s (0.0 degs in 180s)
Script Aborting ... Aborted (Script aborted by user)
<< User Intervention
Observatory Mode 2020-07-07 02:44 (Local)
Setting Mode ... Ok Manual
Park Telescope 2020-07-07 02:44 (Local)
Park Telescope... Parking Moving scope to near its Park Location using TheSky6
Park Telescope... Parking Parking Scope via objScope.Park command
Park Telescope... Ok Telescope is Parked (43s)
Waiting Ok Powering scope off in 30s
Switch Off Telescope Ok Telescope power has been switched off (02:45:33)
AstroMain ObsPics
Obs Pics show the observatory open at 02:04 with the Dome at Az 240, the Scope at Az 148, sky clarity 24, and running under a softsuspend call with the camera still taking images of Berkeley 90. At 02:05 clarity is still <= 24, the observatory has moved to the Suspended state, and the shutter is closing. At 02:12 clarity is at +/- 20 (and the protective relay activated on/off), the dome is still showing as closing, the scope is tracking at 155 deg, but the dome is still at Az 240. At 02:16 telescope tracking has been turned off.
Looking further back at earlier ObsPics shows that the Dome had apparently been 'stationary' at Az 240 since slewing to R CrB at 00:42. Even though Slaving is On, the dome remains at Az 240 when the scope moves to AM Her around 00:49, to CI Cyg around 01:03, to SS Cyg around 01:13, to BL Lac around 01:25, to S Cyg around 01:40 and to Berkeley 90 around 01:53.
- The Slew Table shows that in each case the slews time out after 240s (4 mins) whilst waiting for Dome Slews that didn't seem to occur. (The Dome remains at 239.8 deg throughout.)
- Although the Queue Executor reports large discrepancies between Post-Slew Scope Azimuth and Post-Slew Dome Azimuth (the dome remaining at 239.8 in each case), the results from Image Linking, from Autoguiding and from the FITS images themselves indicate that the main and guide telescopes were both seeing actual sky with stars, rather than the dome roof, and therefore the dome shutter must have slewed according to commands from DeviceHub that kept the dome slaved to the telescope movements.
AstroGuard
AstroGuard shows a problem at around 02:04 with the dome shutter taking >63s to close. A close shutter command is issued at 02:14 after conditions became critical, but it doesn't record the dome actually becoming closed. User intervention at 02:50 to Park Dome.
Dome Safety   Ok        2020-07-06 23:49 (Local)
Dome Safety   Ok        Shutter Can Open
Dome Safety   Ok        Shutter is opening
Dome Safety   Ok        Shutter has opened (42s)
Dome Safety   Info      2020-07-07 02:04 (Local)
Dome Safety   Info      Dome is closing
Dome Safety   Warning   Shutter taken >63s to close
Dome Safety   Warning   Shutter taken >68s to close
Dome Safety   Alert     Cloud conditions are critical, 2020-07-07 02:14 (Local)
Dome Safety   Alert     2020-07-07 02:14 (Local)
Dome Safety   Alert     Conditions are critical
Dome Safety   Action    Shutter will be checked (and closed if necessary)
Dome Safety   Action    Shutter is closing
Dome Safety   Alert     Cloud conditions are critical, 2020-07-07 02:16 (Local)
Close Dome    Ok        Command Issued
Park Dome               2020-07-07 02:50 (Local)
An AstroGuard screen capture at 02:50 still shows the dome as 'closing' and with the shutter at Az 239.9 deg. This was after the dome had been visually checked and found to be Parked and Closed.
This seems to indicate that ASCOM DeviceHub wasn't giving out live data to AstroGuard, and therefore the problem is not isolated to just the AstroMain program.
(Below we see that the DeviceHub Control tab was also displaying frozen data.)
Device Hub
A screen capture of the Device Hub main window at 02:42 shows the dome connected, open, non-slaved and at Az 239.8 degs, with 2 Dome Clients, whilst the Scope is stationary at Alt 78 / Az 157 degs with 4 telescope clients. The Close Shutter and Park Dome buttons on the Dome Control are both greyed out (not active).
At some point around 02:51 (and definitely at or before 02:53), when Services were stopped, DeviceHub put up an error window: "DeviceHub caught a fatal exception during startup. One or more errors occurred". No information could be gleaned on what the exception or the errors actually were.
After stopping the Observatory software and restarting it again, DeviceHub appeared to be functioning normally and showed the dome as being closed & parked, which a visual inspection of the observatory confirmed.
Ascom Trace Logs
Up until 00:41:40 the Pulsar ASCOM Trace Log is normal, with Dome Reads (Get At Park, Get ShutterStatus, Get Azimuth etc.) occurring every 5s, except when slewing, when the read rate increases to every 1s. The Trace Log shows a SlewToAzimuth (239.8) request at 00:41:11, with the dome reaching this azimuth at 00:41:34. The Dome Reads suddenly stop at 00:41:40 (Dome at Az 239.8 deg at this point), and there are just Volatile message lines for the next 90 minutes.
The Trace Log records a CloseShutter request at 02:04:46 (consistent with the HardSuspend at this time), and shows the Shutter then closing (status 3) at 02:04:47 before becoming shut at 02:05:31 (status 1). The closing time (45s) is normal. Volatile lines at 02:04:45 indicate that the Dome is at Az 239.8, but the image data at this time demonstrate that the dome shutter position must be c. Az 147 (allowing the Telescope to successfully image Berkeley 90 at Az 147 deg).
The Trace Log shows a SlewToAzimuth 90 request at 02:16:25, and shows the dome slewing around to the Park Position between 02:16 and 02:17. (This is consistent with the MakeObservatorySafe call from the Obs.Overseer at 02:15.) At 02:19:25 there is an AbortSlew request from the ParkScope routine, as there hadn't been any read request checking whether the Dome had reached the destination azimuth.
At 02:50 there were requests to Close Shutter and to Park Dome (the Dome was already closed and at the park position by this point).
There are no SlewToAzimuth requests in the Pulsar Trace Log between 00:41:11 (request to Az 239.8) and 02:16:25 (request to Az 90). It is difficult to see why there were no SlewToAzimuth requests during this time, as the dome was clearly being commanded to different azimuth positions during this period (as shown by the imaging results). This points to a problem with the ASCOM Pulsar Driver / ASCOM Trace Logging rather than with DeviceHub, but it is unclear whether or not the problem is triggered by DeviceHub.
After restarting the Observatory Software and reconnecting to the Dome, the Trace Log shows normal behaviour with Dome reads (Get Azimuth, Get At Park etc.) occurring every 5s.
Conclusion
A definite issue has surfaced during the last two sessions (S792 and S793) where DeviceHub, after working ok for at least an hour, suddenly stops reading data from the Dome Driver (Pulsar Observatories Dome), leaving Dome Clients and the Device Hub Dome Control tab with dome property information that is frozen at the point of the last successful read. After this time, calls to Dome Control Methods generally work (e.g. the dome physically slews to the requested azimuth, physically opens/closes, or physically parks), even though they don't lead to any change in Dome properties. At first sight this appears to be a problem with Device Hub; however, the absence of any recorded SlewToAzimuth requests in the Pulsar Trace Log over a 90-minute period when the Dome can be shown to have moved puts suspicion onto the Pulsar Dome Driver.
Device Hub & the Pulsar Dome Driver worked successfully together in sessions S790 & S791, so it's unclear what caused the issue to surface in S792 & S793.
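Given the behaviour described above, one possible client-side safeguard is a staleness check on the dome properties. A hedged VB.NET-style sketch (illustrative names and thresholds; objDome.IsSlewing / objDome.Azimuth follow the identifiers used earlier, the rest are placeholders):

Private lastAz As Double
Private secondsUnchanged As Double

Private Sub CheckDomePropertiesLive(pollIntervalSeconds As Double)
    ' Flag the dome properties as suspect if the driver keeps reporting a slew
    ' in progress while the reported azimuth never changes.
    If objDome.IsSlewing AndAlso Math.Abs(objDome.Azimuth - lastAz) < 0.1 Then
        secondsUnchanged += pollIntervalSeconds
        If secondsUnchanged > 120 Then
            LogAlert("Dome properties appear frozen (Az unchanged while IsSlewing = True)")
        End If
    Else
        lastAz = objDome.Azimuth
        secondsUnchanged = 0
    End If
End Sub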
Recommendation
- Reboot computer - DONE
- Reboot Dome Rotation System before next session - TBA (doesn't appear to be crucial as next session was ok using POTH)
- Run next session using POTH Hub rather than DeviceHub - DONE (no problems encountered)
- Run further session using Device Hub - DONE
- Consider posting note at ASCOM Help Forum - DONE
- Note to Pulsar Observatories 2020-11-17 regarding a number of small issues - DONE
Update
Issue was eventually fixed on 2020-10-15 by the installation of Device
Hub 6.5.0.1 which has better exception trapping. Issue deemed closed
on 2020-11-06 after there were no similar incidents during the first 8 sessions
using the new DH version.
Closed 2020-11-06 (Device Hub 6.5.0.1).