Potential Use of Steerable Consumer Home Cameras for “local+remote” Laboratory Instruction

Home camera used to view a physics lab experiment.

In this article I report what I’ve learned so far about the possibility of using steerable consumer home monitoring cameras (“nanny cams”) to allow students to work together on laboratory exercises, with one student in the room and one (or more?) participating using the video and audio from the camera.   (~4660 words)

Introduction and Motivation

The COVID-19 pandemic caused all instruction at SUNY New Paltz to move online in March 2020, including labs.  Introductory physics labs were performed by having faculty record videos showing the apparatus and the process of collecting data, and then students analyzed the data and wrote lab reports. This lost a number of the benefits of student lab work, including both real-time interaction with the equipment and real-time interaction with other students.

Lab students work together as “lab partners” for a number of reasons. One is that equipment can be limited, especially if it is expensive.  Another is that some activities require more than one set of hands.  But even when these considerations don’t apply, it has long been recognized that working together with a lab partner is a valuable part of lab.  Getting an experiment to work requires problem solving and troubleshooting,1 and collaboration makes this easier and more instructive.  Collaborating with a lab partner can be a form of peer instruction, which has long been recognized as a useful tool for teaching physics in both labs and in lecture classes.2

If we are able to have students in the classroom in Fall 2020 they will likely be required to wear masks and to be spread out in the classroom to preserve “social distancing.”  The masks won’t be a problem, but being “spread out” is in direct conflict with working together as lab partners. One potential way to have students work together but preserve distancing would be to have one student working in the classroom and another connected via video chat. A problem with using a standard video chat application is that our laptop computers have a built-in camera that only faces the front of the computer, which does not easily give a view of the equipment. We could add an additional web camera, and then the person in the room could point it at the experiment.  Either way, using the computer for video would provide a static view (which admittedly is better than nothing) and would require the student in the classroom to continually adjust the camera.  A clear improvement would be if the remote student could control the camera themselves to look around as the experiment is being performed.   Unfortunately, cameras that can pan and tilt (and possibly zoom)3 and are compatible with common video conferencing systems like WebEx or Zoom are expensive,4 and we would need one for each lab station.

A note on terminology:  At SUNY New Paltz5  the word “hybrid” is applied to a course which has an in-classroom component and an on-line component.  But this is not very specific. The hybrid courses in our department implement the “flipped classroom” method, where students view content material such as recorded lectures online and then come to class for discussion and problem solving. Having some students in the classroom and others joining remotely via computer is another variation of “hybrid” but is different from a flipped classroom.  I will use the term “local+remote” to specifically mean a synchronous class or activity where some students are physically in the classroom and others join via computer, all at the same time.

Home Monitoring Camera

Figure 1.  YI-Cloud Dome Camera 1080P, a home camera which can pan and tilt.

A potential alternative to an expensive steerable web camera is a commercially available home monitoring camera (aka “nanny cam”).  The model I chose to test, shown in Figure 1, is the “YI-Cloud Dome Camera 1080P,” which sells for about $30 per unit.6 I chose this camera simply because I had previous experience with an earlier static camera from the same company. My family has used that camera to check up on my elderly mother when she was home alone (before the pandemic).  We called it the “Granny Cam.”  Other common uses are to watch pets at home or to keep an eye on a baby in another room. From my prior experience with the static camera it seemed that a similar  camera which is steerable would work well for connecting virtual lab partners.  And contrary to the implication in the name, use of  “the cloud” for storing video is not required.

Network Configuration

Setting up the camera is very easy in a home environment, but more complex on campus, where we use RADIUS authentication for the Wi-Fi network using individual usernames and passwords.  The camera is designed to use WEP or WPA2 authentication (using the SSID of the Wi-Fi network and the (single) password for that network), as is common for most home routers.  The camera has an RJ-45 jack so that it can be connected directly to wired Ethernet, but that only worked after the camera had been paired with my mobile phone app using Wi-Fi.   Since my phone connects to our campus Wi-Fi using RADIUS but the camera cannot, this presented some difficulty.

One way to get around this difficulty is to pair the camera and mobile phone while using a different WPA2 network, and once that is done then the camera can be connected to wired ethernet.  A remaining complication is that Wi-Fi won’t work if that WPA2 network is no longer available (e.g. if you paired the camera to a phone at home and then brought it into the lab).  It is likely we would want to use the wired Ethernet in any case to avoid network interference and congestion from using many of these devices in one room. (But I also found a way around this; contact me for details.)

Figure 2. QR code use for pairing a camera with the mobile app

The pairing process is fairly straightforward.   The camera starts in pairing mode when plugged in for the first time, or after you press and hold a reset button on the back, and it gives voice prompts to indicate where it is in the process.   On iOS at least you have to allow the app to obtain your location — perhaps for added security?  The user enters the Wi-Fi network name (SSID) and password, which are encoded into a QR code (see Figure 2) displayed on the screen of the mobile device. That QR code is shown to the camera,7 which decodes it and uses the SSID and password to authenticate to the Wi-Fi hotspot.  After only a minute or two the camera is paired to the account used on the mobile app. It can then be accessed using the manufacturer’s mobile app from any device using that same account,8 and the camera can also be “shared with family” with someone using a different account (an important feature discussed further below).
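Just to illustrate the mechanism (and only the mechanism: as noted in the footnotes, the actual payload is not plain text, so the plain-JSON format below is purely hypothetical), generating a Wi-Fi credential QR code takes only a few lines of Python with the qrcode package:

    # Illustration only: the real app encodes or encrypts its payload,
    # so this hypothetical plain-JSON format will NOT pair an actual camera.
    import json
    import qrcode

    payload = json.dumps({"ssid": "PhysicsLab", "password": "hypothetical"})
    img = qrcode.make(payload)       # returns a PIL image of the QR code
    img.save("pairing-qr.png")       # this is what gets shown to the camera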

Although you cannot pair the camera on wired Ethernet, it is possible to switch between wired Ethernet and Wi-Fi.  Whenever an Ethernet cable is plugged in to the device it will switch to the wired connection, and when it is unplugged it will switch to Wi-Fi (if it is able to do so).   I have not measured the time it takes to make these transitions.

These kinds of cameras are made to allow a homeowner to see the view from the camera when they are away from home.  In my lab I simulated off-campus access by turning off Wi-Fi on both my mobile phone and an iPad and connecting over my carrier’s mobile data network.  Even though I was in the same room, the packets had to get to and from the camera by entering and leaving the campus network, and that worked fine. I later verified that I could view and control the camera when I was 10 miles from campus.  There was a noticeable increase in latency when using the mobile network from a distance.

Gooseneck Mount

Placing the camera on the lab table did not give a good view of the equipment, which is also on the table, so it was necessary to lift the camera up to the eye height of a typical student.  A traditional camera tripod proved to be too tall when placed on the table with legs fully retracted.  The height would be easier to adjust with the tripod on the floor with legs extended, but the spread legs would then take up a lot of room that would otherwise be available to the student working at the table.

Figure 3. Home monitoring camera mounted on a gooseneck support and aimed at a typical physics experiment (for a lab dealing with Ohm’s Law).
Figure 4. Home camera mounted on a gooseneck support.

To position the camera at the right height and direction I instead used an AboveTEK Heavy Duty Aluminum Gooseneck iPad Holder,9 which clamps to the table top, holds the camera securely by its base with a spring clamp, and can be bent into position to match the eye height of a student without getting too much in the way of the person working at the table (see Figures 3 and 4).  The camera base was tilted downward so that the remote student could tilt the view down far enough to look at the table.  Otherwise the downward tilt angle of the camera was too limited, while the upward tilt went all the way to the ceiling (which is not useful).

We might be able to construct our own gooseneck mounts using flexible metal wiring conduit mounted on the table with bench clamps (we have plenty of those) with a custom-made 3-D printed camera mount on the end.

Audio and Controls

A mobile app is available from the manufacturer for both iOS and Android, and testing shows that the interfaces are very similar, which means that documentation and training for students would not have to be different for the two platforms.

The camera has a two-way audio feature which lets the remote observer hear audio from a microphone built into the camera, and to say something through speakers also built into the camera.  There are two different modes for the remote observer to talk.  A button on the screen can be used for “push-to-talk” or “intercom mode,” meaning it has to be pressed and held while talking, which would not be the best configuration for lab partners.  But the settings can be changed to “hands-free mode” so that the button turns the remote observer’s microphone on continuously until pushed again to turn it off.  This would be the best way for lab partners to carry on a continuing discussion throughout the experiment.  The remote student would have to be given instructions on how to change this setting, since it is not the default.

Figure 5. Mobile app controls shown when the camera is first selected.

The interface for controlling the camera is easy to use, with some minor complications.   The initial view is “portrait” with the steering “joystick” control prominently displayed (see Figure 5).  But the view does not switch to a larger “landscape” view simply by turning the mobile device, as some might expect — one has to press the “spreading arrows” button in the lower right corner to shift to that larger view (see Figure 6).  One on-screen button allows the user to turn the audio monitoring on or off (either as a toggle, or in push-to-talk mode).  Another lets the user take a still photo, which is saved to the camera roll on their own mobile device and can also be saved to Google Drive or shared via email or text message.   Students could use this to take a photo of the entire apparatus, or parts of it, or perhaps of just a meter reading.  Another on-screen button lets the user record video from the session (again saved to their own mobile device).  An example of such a video is shown in Figure 7. A student could record the whole lab, or key parts of it, and review that video later.  In the expanded “landscape” view the camera steering controls are not obvious (but are also not in the way).  Tapping a small icon in the upper right of the screen (see Figure 6) expands the steering controls while hiding the other audio/video controls.  Tapping on the screen again will clear away all control icons. It’s all easy to use once you see it.

The steering controls are fairly responsive when one is on the same network as the camera.  A student using the camera from a dorm room or other location on campus should not have any trouble steering the camera.  There is a bit of a lag in the steering controls when accessing the camera from off campus, but tapping the controls and waiting to see the result still gets the desired effect.   There may also be an audio lag, but that has not yet been tested.

Figure 6. Camera controls,  once the view is changed to the expanded (landscape) orientation.

While the camera does not have a hardware zoom feature, one can zoom in software using the familiar “unpinch” gesture of spreading two fingers on the screen.  The camera has two resolution modes, “SD” and “HD”.  On a good network connection the HD video works well. I have not pressed the limits to see how a poor network connection degrades the video, or measured the bandwidth requirements.

Sharing the Camera

One key feature that will make it easier to use this device for connecting virtual lab partners is that a camera can be “shared with family.” This means that the camera can be paired initially with the account of the lab manager or faculty member who runs the lab and who always maintains control of the device. The video and audio from the camera can then be shared with a student who has a different account (created using their campus e-mail address), but the student cannot accidentally modify or delete the camera, and access to the camera can be revoked once the lab exercise has been completed (if that is deemed necessary).

I will also note that each camera can be given a name, and that name can be changed in the app settings.  We might, for example, change the name whenever a camera is moved to a different lab station.

Figure 7.  Example of data collection as viewed from the camera, in this case for an experiment to study friction. Note the manufacturer’s logo watermarked on the video in the lower left corner.


History Review Feature

The camera manufacturer has a subscription cloud service for saving recorded video, but this is both optional and unnecessary for our planned use of the camera. One can insert an SD card into the camera, in which case video and audio can be recorded automatically on the camera and played back by the remote observer.  This would, for example, allow the remote student to go back and review something that they missed or wanted to study in more detail.  The interface for this is straightforward: a time “scrubber” is shown at the bottom of the screen, and the user can drag the scrubber back to the desired time to view the recorded video (see Figure 6).  This history review feature is always available to the “owner” of the camera, but must be specifically enabled for the guest account to which the camera is shared — it’s off by default.

We will have to think about how we might use this history review feature, and whether it is worth the additional expense of an SD card for each camera. The camera works fine for real-time viewing without the SD card.  With the SD card it would be possible for a student with access to a particular camera to view not just their own work with their own lab partner, but the work of previous students who used the same camera.  That may or may not be desirable, or worth worrying about.  An instructor who was not present for the lab period could use this feature to verify attendance and participation, or to review student work to give assistance, or to evaluate problems with the equipment or lab documentation.

Desktop Client App

It would be useful for the remote student to be able to see the view from the camera on the larger screen of a desktop or laptop computer.  Until recently, this particular camera manufacturer only provided apps for mobile devices, and my own survey (possibly incomplete) of similar products suggests that this is a property of the commodity market.  All of them advertise that they work with manufacturer-specific mobile apps, and many with Amazon Alexa, but there is little or no mention of access via the web or a desktop computer.

Fortunately, this camera manufacturer has recently released apps for both Mac and Windows. Unfortunately, the Mac app failed to run on MacOS 10.15.4 with a pop-up warning saying it “can’t be opened because Apple cannot check it for malicious software.”10   Further testing is warranted, but this is discouraging.   On Windows there is a similar warning when you go to install the software, but you still have the ability to do so. Once started, the app looks a lot like a mobile app, but still with standard Windows controls in the title bar.11

The Windows app works for viewing video from the camera and listening to its audio, and the controls work to steer the camera, but the intercom feature is missing, so the two lab partners would not be able to talk using the camera microphone and speakers. I also had a problem creating a new camera account in the Windows app, so a mobile device may be required for that.  The desktop app does show multiple cameras “owned” by the same account, so it would be handy for an instructor or lab manager to check on the status of all cameras while in use.  But the Windows app also seems to be missing the feature to share a camera with another account, which means that a mobile device would be required to share a camera with a student for the duration of an experiment.  In short, the Windows client app is behind the mobile app in several important ways.

Given all this, we should keep in mind that students might concurrently use video chat software such as Blackboard Collaborate Ultra (our primary video meeting tool for classes), or WebEx (also favored on our campus). In that case the students could talk directly to each other, face to face, and use the video meeting for audio as well.  Using both the steerable camera and the video chat could give the students an even better means of collaboration, and give the remote student an even greater sense of presence during the experiment.  The bandwidth required to support two video streams might be a limiting factor for some students. (TODO: verify that it’s not a limit on campus.)  On the other hand, the bandwidth required for even one video stream might be a problem for a student using a mobile data plan instead of Wi-Fi.

Multiple Remote Users?

Access to a camera can be shared with more than one guest observer, so it might be possible for a person physically in the lab to work with more than one remote lab partner.  Initial tests of several remote observers accessing the same camera showed some intermittent video buffering, but it was not clear that this was actually due to having more than one person viewing the camera — it might have simply been that the camera was too far from the router for a reliable connection.  A later test with the camera connected via wired Ethernet supported multiple users with no observed buffering. Further testing is warranted, but this is encouraging.

If more than one person can connect and use the camera at the same time, then this could also make it possible for an instructor to render assistance to students during the experiment without having to be physically present in the room, thus further supporting social distancing.

Signal Security

According to the manufacturer’s website12  the “requests” between the camera, mobile devices, and servers use secure HTTPS with a “two-way mutual authentication process to ensure that the user’s personal information is not compromised. Each device has its own key and certificate to authenticate with the server.”   The video from the camera to the user device is an encrypted Peer-to-Peer connection so that “only the end user can view the video content. Any possible interception happening during transmission information will only see scrambled and encrypted data.”
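As a rough illustration of what two-way mutual authentication means in practice (this is generic Python TLS code, not the camera’s actual implementation; the host name and file names are hypothetical), the client verifies the server’s certificate and also presents its own:

    # Sketch of mutual TLS: the client checks the server's certificate
    # and presents its own.  Host and file names are hypothetical.
    import socket
    import ssl

    ctx = ssl.create_default_context()          # verifies the server
    ctx.load_cert_chain(certfile="device.crt",  # the device's own
                        keyfile="device.key")   # certificate and key

    with socket.create_connection(("camera-api.example.com", 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname="camera-api.example.com") as tls:
            print("Negotiated", tls.version())  # e.g. 'TLSv1.3'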

I also note that the sharing system is designed to limit access to the camera to the “owner” and anyone the owner designates via their campus e-mail address. We can also revoke the share after the lab exercise is over, if we need to do so. It is further possible to add a PIN code to the camera, so that the user has to enter that code to access the video stream even if it is shared with them.  Then access security is ensured by both something they have (their phone with the app) and something they know (the PIN code).

Other Camera Features

Since this is a camera for home monitoring, it has other features that are probably not useful for the local+remote classroom use case, but some should be mentioned if only to warn the user to disable them lest they get in the way:

  • The camera has a motion detection feature where it automatically points to the source of motion. This does not work well in the lab during an experiment.  It is off by default (TODO: verify) and in any case should be disabled for lab work.
  • The history review feature can also be configured to only record when motion is detected. That might be useful for helping find a particular bit of video based on past activity, not just time, especially if there are long breaks between data-taking sessions.  (It is not necessary to worry about filling the SD card with video, as the device simply records over the oldest previous recordings.)
  • The status light on the front of the camera base can be configured to stay off when the camera is in use.  But the light flashes if there are network problems, and having the light on is a reminder to the students that the camera is operating, so this option to turn the light off should not be used.
  • The image can be rotated upside down, for mounting the camera on the ceiling.  Maybe there is a way to use this to give the remote student a better view?
  • There is a “crying baby” detection feature which one hopes is not necessary (but putting our students through all these complications just to get their degree might trigger some justified crying).
  • The camera has infrared LED lighting which can be used to view and record in otherwise total darkness.  We probably don’t need this feature.
  • The camera microphone can be disabled in the settings.  But since it can also be turned on or off by the primary user controls, totally disabling it would only be useful if we always use video chat in addition to the steerable camera, and it would be confusing if students decided later to use the camera audio.

Other Manufacturers?

I only tested this single camera from one camera manufacturer. I might test others, and I might even post a comparison.  Or maybe not, if we decide that this camera satisfies all our requirements. Because these are commodity devices for home use I expect that similar cameras on the market from other manufacturers will have similar features.   I would welcome reports from readers about which needed features are present or absent in some other make of home camera.

If using these cameras to enable virtual lab partner collaboration works as expected, the market for such devices might get tighter, just as it is tight now for webcams.  In that case using cameras from other manufacturers, and maybe even mixing them, might become a necessity.

Legal Concerns

Even though it appears that there are no major technical problems with using this kind of camera as proposed, there may well be legal or policy hurdles to be surmounted.  At SUNY New Paltz I’m told that installing devices that capture video or audio from a classroom space requires approval from the University Police Department (UPD), and possibly the HR department as well.  I am hopeful that we can get such approval, because we already have lecture capture cameras installed in our labs, and those must have already been approved.

Concern has also been raised that we must ensure that we comply with NY State Penal Code Article 250.00, which deals with wiretapping and interception of electronic communications.  One type of crime described there involves a third party who intercepts signals meant for someone else.  If the peer-to-peer video stream between camera and observer is properly encrypted, as the manufacturer claims, then wiretapping and interception by a third party should not be possible, though our IT staff may need to verify those claims of encryption and that it is adequate. Another type of crime applies to someone who sets up “an imaging device to surreptitiously view, broadcast or record a person” in various private situations (NYS Penal Code Section 250.45).  Clearly, lab cameras used as described above are not “surreptitious.”  Furthermore, according to NYS Penal Code Section 250.65 the provisions of §250.45 do not apply to “video surveillance devices installed in such a manner that their presence is clearly and immediately obvious.”   As long as the cameras are as visible to students as they are in the photos above, with the blue light on to show that they are operating, then there should be no problem.  However, it is clear that the University Police and lawyers will have to render judgement on all this at some point.

The purpose of this article has been to see if there are any technical hurdles that prevent the use of commodity “nanny cameras” from being used to enable virtual lab partner collaboration, before we get as far as involving administrators and lawyers.   So far, so good.

Notes and References

  1. Troubleshooting equipment is an excellent model of the scientific process of forming a hypothesis and then testing it.
  2. See, for example, Peer Instruction, A User’s Manual, by Eric Mazur (Prentice Hall, 1997)
  3. Cameras that can Pan and Tilt and Zoom are called “PTZ” cameras.
  4. A Logitech PTZ Pro 2 Camera costs over $750 on Amazon, or  $850 direct from the manufacturer.
  5. It may be that “hybrid” has the same meaning throughout SUNY, but I have yet to confirm this.
  6. The current price for the YI-Cloud Dome Camera 1080P from the manufacturer is $33.99, with free shipping only on orders over $35. The current price on Amazon is $29.99 with free shipping.
  7. I tried saving a screen shot to use later to add another camera, but I later found this doesn’t work.   The information in the QR code is not encoded as plain text, and so likely includes a time or location to prevent just such a “replay attack”.
  8. Beware, it seems that an account on the mobile device and an account on the manufacturer’s web site are not the same thing.
  9. AboveTEK Heavy Duty Aluminum Gooseneck iPad Holder, about $30 from Amazon at the time of purchase, but the price keeps going up.  I also found these useful for making home-made document cameras. I suspect the price has gone up because others have discovered that too.
  10. On an older Mac I was told the app requires MacOS 10.11 or later. Alas, that machine was too old to run the app.
  11. Examining the libraries installed with the Windows app showed standard graphics libraries and the Qt interface library.
  12. “How do I ensure the security and privacy of my videos?”  YI Technology Help Center, https://help.yitechnology.com/hc/en-us/articles/234469188-How-do-I-ensure-the-security-and-privacy-of-my-videos-

Using a Document Camera for ‘local+remote’ Instruction


This article provides configuration information for using an in-classroom document camera so that the content it displays is available to both students in the classroom and those joining via video meeting software (such as Blackboard Collaborate Ultra).  These instructions are particular to our Physics Labs, but the general idea should apply more widely. (1050 words)

Introduction and Motivation

The COVID-19 pandemic caused all instruction at SUNY New Paltz to move online in March 2020, after an extended Spring Break.  If we are able to have students in the classroom in Fall 2020 they will likely be required to wear masks and to be spread out in the classroom to preserve “social distancing.” Spreading everyone out in a classroom may pose problems —  in some cases it just won’t be possible to have all enrolled students in the same room if they have to all be at least 2 meters away from each other.

One way to deal with this is what I will call  “local+remote,” which means a synchronous class or activity where some students are physically in the classroom and others join via computer, all at the same time. Students observing or participating remotely might watch via a classroom lecture capture camera (we have these in our physics labs) or a webcam, or maybe even a “nanny cam”.  The particular mode I consider here is that the instructor presents their content using a document camera, and that content is both displayed on the projection screen in the classroom and is shared with remote participants using video meeting software (such as Blackboard Collaborate Ultra).

The instructions presented here apply to the AVer F50-8M document cameras found in the Physics Labs in Science Hall at SUNY New Paltz, used with Blackboard Collaborate Ultra.  This model of document camera is also used in other classrooms in Science Hall, but classrooms in other buildings may be different.  Even so, the general idea may still work with other document cameras.  If you are unsure, ask Instructional Technology Services for assistance, and try everything well before your class starts.

Cable Required

To make this work in our labs required an additional cable connecting the AVer document camera to the computer.  This cable connects to the document camera with USB-mini B, and connects to the computer via standard USB-A (what most people think of as “regular USB”).  Actually, in our labs this additional cable from the document camera connects to a USB-A extension cable, which then connects to the computer. In SH 157 and SH 159 the extension cable is already routed through the internals of the instructor station from the back of the computer and up through a hole in the desk.   In SH 160 the extension cable is currently external.

To use this kind of document camera in any other classroom you will need both these extra cables.  Why?  The video from the document camera normally comes out of the HDMI port to the Crestron system to be routed directly to the projector, bypassing the computer.  The USB-mini B cable takes video output from the document camera to the computer, but the distance from the document camera to the front of the computer is far enough that you likely also need an extension cable (we did).

Also note, not all classrooms on campus have the same make and model of document camera, so you should confirm which kind of cable is actually needed before assuming another classroom is just like our physics labs.   Test everything before you teach your class, and ask Instructional Technology Services for assistance.

Classroom Setup

  1. To begin, turn on the document camera (press the power button).   It takes a while for it to wake up and be recognized by the computer, so do this first.  Verify that the USB cable is plugged in to the back, and (if you can tell) connects the document camera to the computer.

    Figure 1. Crestron A/V control panel with the Display “ON” and “PC” selected as the video source.
  2. Next, wake up the A/V control panel by touching the screen, and then turn on the projection system by pressing the “DISPLAY ON” button (see Fig. 1).  This should cause the projection screen to drop from the ceiling.  Select “PC” as the source for the classroom projection system.
  3. Plug in your external webcam (if you have one1) and position it so that your students will be able to see you while you are teaching.
  4. Turn on the computer and log in, then start a browser (Chrome is recommended over Firefox2 — at least for now).

    Figure 2. The Blackboard icon for opening the controls for a Collaborate Ultra meeting.
  5. Then start the video meeting, using Blackboard Collaborate Ultra.  When you are starting the meeting you will configure your audio and video.  Pay attention to your choices here, because you will be able to select either the webcam or the document camera as the video source, and when you start the meeting you will want the webcam (if there is one).   You might prefer the document camera as the audio source, since you’ll be speaking nearer to it than to the webcam once you are presenting from it.

    Figure 3. The Blackboard Collaborate Ultra sharing control icon.
  6. In the video meeting, open the control panel on the right (using the purple left-chevron icon shown in Fig. 2) and select the “sharing” menu (the icon shown in Fig. 3). From the sharing menu select Share Camera. You will then be given the choice to select either the webcam or the document camera; choose the document camera so that its image is shared with the remote students.
  7. Teach your class using the document camera to present equations, drawings, and other visual content; both local and remote students will be able to see it, hear you, and see you.


Figure 4. Document camera image projected in a classroom and shared in a video meeting at the same time.

Audio Headset

To be able to hear students who are remote you will likely need a headset, but to be able to hear students in the room you will want to keep one ear uncovered.  One way to accomplish this is to use ear buds with only one of them in your ear (the one with the microphone).   This has the advantage of getting the microphone closer to your mouth.

Perhaps even better is to get a “Truckers” bluetooth headset, which can connect to the computer wirelessly, has a boom mike, and a speaker for only one ear (since truckers need to keep one ear uncovered for safety).   Most let you use either ear.

Either way, you will have to configure the computer to use ear buds or the bluetooth headset for audio input and output  (documentation forthcoming).  Be sure to test this before you teach your first class.

Notes and References

  1. It is possible to teach using just the document camera but not a webcam, but then your students won’t be able to see you, only the image from the document camera.  If you don’t have an external webcam, the AVer F50-8M has a built-in microphone, so you will not need an extra microphone.
  2. Firefox has problems sharing an application window (not the whole screen) in Blackboard Collaborate Ultra. NEED TO TEST THIS WITHOUT USING SPHERE2!  Sharing a single application window with Chrome had no problems. Microsoft Edge has not yet been tested.

Using Duo Security Two-Factor Security at SUNY New Paltz


SUNY New Paltz is in the process of adding Two-Factor Authentication (2FA) to their administrative computer systems, and I have been trying it out.  This is a report on some of the things I’ve learned, such as how to get it to remember you for 5 days without having to accept all third-party cookies.

Two-Factor Authentication

A lot of people are familiar by now with Two-Factor Authentication.   After you log in with a password (something you know) a message of some sort is communicated to you through a secure channel to a device assumed to be under your control (something you have).    You then have to prove that you received this message, to prove that it is really you logging in, not just someone who has stolen your password.

A very familiar example is 2FA on Google accounts.  When you log in with your password, Google sends a 6-digit code number as a text message to your mobile phone. (They will also call you on a voice land-line, if you don’t have a way to receive texts.)  You then type in that 6-digit number to complete the authentication process.   Facebook does something similar, but you use the Facebook app on your mobile device to get the 6-digit code, which changes every few minutes. I highly recommend enabling 2FA on both Google and Facebook.
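Codes of this kind are typically time-based one-time passwords (TOTP, RFC 6238): both sides compute the same short code from a shared secret and the current time. Here is a minimal sketch using the pyotp package (the secret here is made up on the spot; in reality it is established when you enroll your device):

    # Both ends derive the same 6-digit code from a shared secret
    # and the current time window, so the secret itself is never sent.
    import pyotp

    secret = pyotp.random_base32()    # in reality, set during enrollment
    totp = pyotp.TOTP(secret)
    code = totp.now()                 # e.g. '492039'; changes periodically
    print(code, totp.verify(code))    # verify() is what the server does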

New Paltz is using a 2FA system from Duo Security, which can work the same way, sending you a 6-digit “passcode” for you to enter as part of the authentication process.  But Duo also offers the option of a “push,” in which the message is sent to an app on a device assumed to be under your control, and only your control.   In that case you can simply push a button on the app to accept the authentication (or another button to deny it).   You don’t have to type in the 6-digit number.   The device can be  a mobile phone, a “dongle” device you carry on your keychain, or even an Apple watch.    Here is the challenge page you will see after you enter your password:

Figure 1. Duo 2FA challenge page

Click on  “Send me a Push” and then press the “Accept” button on your mobile device and you are in.  Easy.

Apple Watch

I have an Apple watch, which makes using Duo 2FA very easy.
After I’ve entered my password I click “Send me a Push,” and a screen on my watch comes up with the name of the site or service to which I’m trying to authenticate, and a button to Approve the connection (See Figure 2).

Figure 2: Duo Security on Apple Watch

There is another option under that, to Deny the connection, but I have to scroll down for that option. So far I have not accidentally pressed Approve when trying to scroll down to get to Deny but it’s a concern.

I prefer using the watch for authentication, but I have learned that if I have recently been using my iPhone and it is still open then the “push” will go to the phone and will not go to the watch.   That is confusing at first, when you expect the push on your watch and it does not show up there.   Check your phone.

(Maybe they could make it show up on both the phone and watch?)

In order to use the Apple watch app I had to install the iPhone app first, and then open the Apple Watch controls and find the Duo app there and enable “Show App on Apple Watch”.

Third-Party Cookies

When you initiate authentication and the Duo challenge page comes up, there is the option to have the device remembered (and authorized) for 5 more days.   You can see a checkbox for this in the Challenge page in Figure 1.   You can also see the disclaimer that “You need to enable cookies in order to remember this device.”   What they actually mean  is that you need to enable third-party cookies, which are cookies set on your browser from a site other than the one you are visiting.   Even if you have enabled cookies in your browser, you will find that you are unable to check that box if your browser does not allow third-party cookies.

By default, I turn off third-party cookies.   They are used for tracking by advertisers, which I prefer to avoid, and I can’t help but think they are a potential security weakness, though as of this writing I don’t know of any active exploits.  The compromise is that browsers let you make exceptions, blocking most third-party cookies but allowing them from selected sites.  Using Chrome, I found I could enable this 5-day “remember me” feature and still block third-party cookies in general if I made this exception for the site:

[*.]duosecurity.com

The special characters at the beginning are a “wild-card” match pattern, which is necessary because the hostname part of the URL seems to change from session to session.  (In contrast, when I found how to enable Starfish Early Alert with a single exception for third-party cookies the hostname was specific to our campus.)  The same should work with Firefox.

Although I have not finished testing yet, it seems that authorization is based on IP address, which means that if you use Duo 2FA on your desktop computer using one browser, then you are automatically authorized using a different browser.   Does this require checking the “remember me” box or is it automatic?  I am still trying to figure that part out.

Visualization of the Cislunar Potential


In the past few days I’ve been following the playback of the Apollo 11 mission in “real time” (delayed exactly 50 years) at https://apolloinrealtime.org/11/.  Among all the news related to the 50th anniversary of the Apollo 11 moon landing I also found an article from NPR about 3 different approaches to how to get to the moon, and how the one they used almost wasn’t (Meet John Houbolt: He Figured Out How To Go To The Moon, But Few Were Listening).  This has caused me to think a bit about how to visualize the gravitational potential between the Moon and Earth (“cislunar”1) and how the path chosen for the mission was somewhat like a car driving up a mountain to go over a pass to a valley on the other side.   As a result, I wrote a short Python script to show the potential as a surface over a two-dimensional plane, and then “played” with the color map and other effects to highlight that “mountain pass” between the Earth and the Moon.

I’ve included a derivation of the equation for the gravitational potential for those who are interested, but it’s down at the end (for those who are not).  The Python script which was used to create all these images is at http://www.spy-hill.net/myers/astro/cislunar/CisLunar.py.  See the comments in the code for the small changes needed to produce each of the variations shown below.

Visualized as a Terrain Map

This first plot shows the potential surface with a color map commonly used for terrain, where sea level is blue, lowlands are green, mountains are brown, and mountain tops are white:

Figure 1: Cislunar potential using a terrain color map.

The idea is that going to the Moon is like climbing a mountain. Going up a hill increases your gravitational potential energy, and the same is true when a spacecraft goes up this surface on the way to the moon.  The blue at the bottom of the left potential well is the Earth, and the dimple up the hill to the right is the Moon.

The x and y coordinates on this figure are measured in units of the radius of the Earth (symbol R). The vertical scale is logarithmic2 to make it easier to see the variation on a reasonable scale. The horizontal spacing is to scale: the Moon is a little over 60 Earth radii from the Earth.   The bottom of each potential well is flat and to scale.  That shows the small size of the Earth and the Moon compared to the distance between them.

Viewing from the side shows the variation in “elevation” a little better:

Figure 2: Cislunar gravitational potential with a terrain color map, side view.

Now you can at least start seeing the idea that going to the moon is like going up a mountain and then down into a valley.

Visualized with Color Contours

The problem with the images above is that the color map changes gradually, so you cannot see the subtle changes at the “mountain pass”.   To help visualize that more clearly I switched to an artificial color map3 which varies between colors more often and uses more distinct colors.  Here is the result:

Figure 3: Cislunar gravitational potential using the ‘prism’ color map to show smaller changes.

Now you can see that the “mountain pass” actually looks like a narrow gateway. This becomes clearer if you zoom in on the moon:

Figure 4: False color contours of the cislunar potential, close up near the moon.

If you want to get up and over by using the least amount of energy (and thereby the least amount of fuel) then you would want to go right through the center, following the yellow V  up to where the color turns just a little green and then turns back (down) to yellow.  This shows that it’s a rather narrow mountain pass, so maybe it’s better to describe it as a gateway.

It is tempting to try to identify the “gateway” point as the Earth/Moon  Lagrangian Point L1, but it’s not, although they are probably close.   At the L1 Lagrangian point the Earth’s gravity is just slightly stronger than the Moon’s, such that an object there would orbit the Earth with the same orbital period as the moon (and thus if left there would simply follow along  with the moon, sort of like flying in formation).   The whole analysis presented here neglects the fact that the moon is actually in motion around the Earth, and to the extent that we are ignoring that motion the mountain pass is essentially the L1 Lagrange point.  To really get it right we would want to add the “effective” potential for an orbiting object.
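For reference, the standard way to “really get it right” is the effective potential of the circular restricted three-body problem, which adds a centrifugal term (in the frame co-rotating with the Moon) to the two gravitational terms used here:

\[ V_{\rm eff}(\vec{r}) = -\frac{G_N M_E}{|\vec{r}-\vec{r}_E|} - \frac{G_N M_M}{|\vec{r}-\vec{r}_M|} - \frac{1}{2}\omega^2 \rho^2 \]

where \(\omega\) is the angular velocity of the Moon’s orbit and \(\rho\) is the distance from the rotation axis through the Earth–Moon barycenter.  The Lagrange points, including L1, are the stationary points of this effective potential.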

Visualized with  Contour Lines

One friend I showed this all to had a little trouble visualizing the idea with the false color map, because the colors can look like “bumps”.   So I also plotted the surface using just contour lines, with the following result:

Figure 5: Cislunar gravitational potential near the Moon, using contour lines.

I like how one of the contour lines, which starts out “downhill” from the moon, actually loops back behind the moon.   This representation is the closest to a topographic (“topo”) map, where contour lines show elevation (and thus also gravitational potential energy).

That figure just shows the moon, but when you back out a bit to show both the Earth and Moon it also helps cement the idea of  gravitational potential as elevation:

Figure 6: Cislunar potential of the Earth and Moon.

Derivation of the Gravitational Potential

For completeness, here is my derivation of the expressions used in the script for the gravitational potential.  To start, the gravitational potential (symbol V) is the gravitational potential energy (symbol U)  divided by the mass of the spacecraft.  Dividing by the mass of the spacecraft makes the result proportional to U but independent of m, and thus we can think of it as being a property of the space at that point, independent of what is there.   And to get the potential energy, just as with electricity, multiply the potential by the amount of “charge” (in this case mass) at that position: U=Vm.   Now we just need to compute V.

To get the gravitational potential energy we can start with the force between the Earth or Moon and the spacecraft, which is given by Newton’s Law of Gravitation:

\[ F = \frac{G_N M m}{d^2} \]
where M is the mass of the planetary body, m is the mass of the space ship, d is the distance between the two (center to center), and \(G_N\) is Newton’s constant of gravitation.  This is the magnitude of the force — the direction is of course attractive, through the centers of mass of both bodies.   If we integrate this force over distance (with the conventional choice that the potential is zero infinitely far away), we get the gravitational potential energy. Then, to get the potential, we divide by the mass m of the spacecraft to get

\[ V(d) = -\frac{G_N M}{d} \]

This equation holds for both the Earth and the Moon (for different values of M), and we can simply add the two to get the total gravitational potential at any position.
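Here is a condensed sketch of how the surface plots were produced (the full script is linked above; the grid limits, resolution, and colormap below are illustrative choices, not necessarily the ones I used):

    # Sketch of the cislunar potential surface.  Grid limits and colormap
    # are illustrative; see the linked script for the real version.
    import numpy as np
    import matplotlib.pyplot as plt

    G   = 6.674e-11        # Newton's constant (SI units)
    M_E = 5.972e24         # mass of the Earth (kg)
    M_M = 7.342e22         # mass of the Moon (kg)
    R_E = 6.371e6          # radius of the Earth (m)
    R_M = 1.737e6          # radius of the Moon (m)
    D   = 60.3 * R_E       # Earth-Moon distance, center to center

    # Grid over the orbital plane, in units of the Earth's radius
    x, y = np.meshgrid(np.linspace(-20, 80, 400), np.linspace(-50, 50, 400))
    r_E = np.maximum(np.hypot(x, y) * R_E, R_E)          # clip at the surface
    r_M = np.maximum(np.hypot(x - D/R_E, y) * R_E, R_M)  # (flat well bottoms)

    V = -G*M_E/r_E - G*M_M/r_M     # total gravitational potential

    # Plot the log of |V|, putting the minus sign back in (see note 2)
    Z = -np.log10(np.abs(V))
    ax = plt.figure().add_subplot(projection='3d')
    ax.plot_surface(x, y, Z, cmap='terrain')   # try 'prism' for false color
    plt.show()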

The code I used to make the images is available as a github gist.

Notes

  1. The word “cislunar” means “between the Earth and the Moon”, while the word “translunar” means “from the Earth to the Moon”.  https://wikidiff.com/translunar/cislunar
  2. Since the potential is negative, I actually use the logarithm of the absolute value of the potential, and then put the minus sign back in for the downward direction.
  3. The ‘prism’ colormap from matplotlib.

Summer Solstice 2019 in Wooster Hall

Wooster Hall main staircase at noon on the Summer Solstice of 2019.

As we have done in past years, a small group of those interested came to Wooster Hall to observe the bars of light from the skylight cross over the staircase at Solar Noon (at 12:58:16 EDT). This year the actual Solstice (at 11:54 EDT) was very close to the same time, which is not always the case. Raj Pandya, director of the John R. Kirk Planetarium, led everyone there through the simple calculation of the highest angle of the sun that day.
I set up my network camera to make the following time-lapse video:

The reason we are all supposedly there is to watch the bars of light crossing the upper staircase, but it looks to me like people were more interested in everyone else. Which may be as it should be.

Cleaning Pennies with Taco Sauce

Pennies after cleaning.

I have been collecting old pennies for a science experiment.   (The composition of the penny changed in 1982, which changed the weight slightly, and I will soon have a student exercise that makes use of that weight difference.  Stay tuned…)    I wanted to clean the pennies enough that they were recognizable as pennies, and so that you could clearly read the date, and also so that at first glance you didn’t know if they were older pennies or not.

Doing some reading on the Internet I found that you should not clean pennies if they are old and potentially valuable.   So the one 1940 “wheatie” that I found in the pile will not be the subject of today’s experiment.   The suggestions I found for just getting the oxide layer off were to use a weak acid, like vinegar, and perhaps throw in some salt, which somehow makes the acid work better.

Then I found someone who pointed out that these two ingredients, vinegar and salt, are key components in ketchup, so you should be able to clean pennies with ketchup.  Or they had actually done so.  I don’t remember which, and I don’t have a link, because it doesn’t matter, because it’s a testable hypothesis.   Only I didn’t have any ketchup available in the lab.

But I did come across some taco sauce at dinner, so I decided to put that to the test.   I took 16 rather tarnished pennies and put 8 each into two different brands of taco sauce for 5 or 10 minutes.   See Figure 1.  (I didn’t watch the clock – I had an intervening  conversation with a colleague so that’s only an estimate.)

Figure 1: Pennies soaking in taco sauce

During the process it seemed to me that the brand on the right was doing a better job, but in the end I’m not so sure.   I rinsed them off with water and dried them and arranged them with the worst side up (if there was a worse side), and from the photo in Figure 2 I can’t really say one did a better job than the other.

Figure 2: Pennies after cleaning.

What is more notable is that there is a wide variation of the results within each treatment group.   Maybe some of those pennies needed more time, or a second round of taco sauce?

I have to admit that I was not careful enough to document the pennies before the treatment.   The lines of pennies above each packet were taken from the same source of pennies and show about the same levels of oxidation as the pennies I used, but they are not the same pennies.  So what you can clearly see is that the treated pennies are cleaner and shinier, which was the goal, and both brands of taco sauce did about the same job.

Further investigation is clearly warranted, so I may stop by Taco Bell tonight…


Vernal Equinox 2019

What a difference a year makes. Last year I made a time-lapse video of the Vernal Equinox in Wooster Hall by standing next to a wall and taking a photo about once a minute for an hour. This year I have a much nicer video from a Raspberry Pi camera which captures an image every 5 seconds. Here is the result:


I’ll edit this post later to provide more information, but for now I’m posting it in the hopes that the video is useful for tonight’s event, which alas I will miss because I teach a lab during that time. Enjoy!

Added the next day…

Here are some of the details about how this video was created.  You can compare this to how I did it a year ago and see if you think I’ve made progress.

First, the images were all captured by a Raspberry Pi 2 with attached camera, driven by a Python script which is started at boot time (unless there is a mouse or keyboard plugged in to a USB port).    The camera simply takes an image every 5 seconds and saves it.   No further processing is done on the Pi.
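The heart of the script is just a capture loop. Here is a minimal sketch using the picamera package (the resolution, directory, and file naming are illustrative choices, not the exact script I ran):

    # Minimal time-lapse capture loop for the Raspberry Pi camera.
    # Resolution, directory, and file naming are illustrative choices.
    import time
    from picamera import PiCamera

    camera = PiCamera()
    camera.resolution = (1280, 720)
    time.sleep(2)                    # give the sensor time to settle

    while True:
        # Timestamped names so the frames sort in capture order
        fname = time.strftime('/home/pi/frames/%Y%m%d-%H%M%S.jpg')
        camera.capture(fname)
        time.sleep(5)                # one frame every 5 seconds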

Then I take the Pi back to my lab and plug in a monitor, keyboard, and mouse.  The only reason for the mouse is to inhibit starting the camera.   I zip up the images into a tarball and copy that to a memory stick, which I then take to the Mac in my office.

The Mac runs the free version of the software TLDF  (the name comes from “Time-Lapse De-Flicker”).    I drag the files into TLDF, check the box for “Blend” and set it to blend 3 frames, and press “Render”.   It does the rest, and produces an MP4 video.   That’s it.

It’s not as fun as writing my own Python script to do variable duration frames in an animated GIF, but I sure do like the results.


YSC-4 Electronic Clock


I’ve just completed building a small electronic clock from a kit, the YSC-4 kit from HiLetgo, which I was able to purchase from Amazon for under $9.1   My interest in this kit was to find something simple that is nevertheless good soldering practice for advanced beginners, and I was not disappointed.

Overview

The kit provides practice for a number of things that students should encounter:

  • an electrolytic capacitor  (requires specific polarity)
  • a buzzer (also has specific polarity)
  • a transistor  (three close leads, and requires proper orientation)
  • an IC socket, and the IC itself (oriented by a notch, and soldering close contacts)
  • segmented display digits (orientation and close contacts)
  • 2 momentary contact switches (orientation)
  • a network resistor pack (orientation and close contacts)

This version comes with a wall-wart with USB socket and a USB cord to the power socket.   I have since found a variation from another vendor (without the wall wart), which comes as a two-pack.  Yet another version, which costs slightly less, has just terminal posts for the power, though I think students are more likely to use their creation if it has the USB power cord.

It took me under an hour to assemble, even with a break for a snack. A beginner might take longer, but would have no difficulty. The kit included a piece of paper with a list of components and a circuit diagram, along with (somewhat confusing) instructions on how to set the time and alarms. The kit did not include step-by-step assembly instructions, but since the PCB is well marked it is clear what goes where, and so step-by-step instructions really are not necessary.  The one piece of advice to give to students is to start at the center of the board and work out, to make access to the leads easier when soldering.

Tips and Tricky Bits

Perhaps the trickiest thing in this kit was the  orientation of the network array; it has a dot on one end and markings on the PCB to show which end goes where.  We all missed that at first, so some boards had to have that component removed and resoldered.  Another tricky point was the switches, because it was not clear at first without testing with a meter which contacts are always joined and which are only joined when the button is pressed. Rotating the switches by 90 degrees will be the same as having the buttons always pressed down. As you might be able to see from the photo, the leads go on the sides, not the top and bottom.  It helps to think of the leads as two sets of flat straps that go across the switch from one side to the other.

Another thing that might trip up beginners is the orientation of the segmented display (the decimal points go at the bottom, as does the writing on the bottom side).  Unlike other PCBs I have worked with, there are no components where you have to guess the orientation.

Some other things to note:

  • This clock has a 24 hour display (no 12 hour display).
  • It will chime 3 times on the hour (unless you turn that off).
  • There are two alarms.   When initially turned on, the time  is 12:59 and the two alarms are enabled and set to 13:01 and 13:02.
  • There is no back-up battery, so you have to set the time (and alarms) again if you ever unplug it or it loses power somehow.

The display is very bright, but since the segments in the segmented display are white when not lit it can be hard to read the time from the bare clock face.  You can see this in the photo at the top of this post.  The solution to this is to cover the display with red or grey tinted plastic, so that only the lit red segments are visible.   I had a roll of red “tail light repair” tape which is 2″ wide and it fit perfectly, as shown here:

YSZ-4 clock with red tape over the display

However, we’ve learned not to put the tape on the display until the clock is working, as it obscures the decimal points at the bottom, leading to more problems with the display ending up upside-down.

We have had some success with replacing the buzzer with an LED, though it seems that the LED may eventually burn out, so it would be wise to add a resistor in series.  One student tried to put an LED in parallel with the buzzer, and that failed, but maybe adding a series resistor would make that work too (it has not yet been tried).

Operating Instructions

The operating instructions that came with the kit are written in English, but appear to be a direct translation from Chinese and are somewhat confusing.   I found another set of instructions on the net that also appear to be translated from Chinese, but translated differently.   From those and my own experience I was able to put together these operating instructions:

Switch S1 (on the left) is the Menu button.   An initial long press enters the first menu.  The menu pages are named A, B, C, D, E, etc., and the menu letter is shown in the first digit of the display.   A short press on S1 takes you to the next menu.   You can only exit the menus by stepping through all of them; a long press will make them step through quickly (but if you don’t release the button at the right time you’ll start the menu list over again).

Switch S2 (on the right) is the toggle/increment button.   On each menu page, use it to toggle a feature on or off, or to increment a numerical value.   For numerical values you can hold the button down and the count will go up automatically.

The menu pages are:

  • A – Hours, from 00 to 24  (there is no 12-hour option)
  • B – Minutes, from 00 to 59
  • C – Hourly chime.  If enabled the clock will beep 3 times on the hour, but only  between 08:00 and 20:00.
  • D – First Alarm on/off
  • E – First Alarm hour
  • F – First Alarm minutes
  • G – Second Alarm on/off
  • H – Second Alarm hour
  • I – Second Alarm minutes

If an alarm is turned off then the menu will skip the hour and minutes items for that alarm. There is no way to exit the menu pages early; you must cycle through all of them to get back to normal operation.

Outside of the menus, a short press on switch S2 will change between displaying hours and minutes or displaying minutes and seconds. While the minutes and seconds are displayed, a long press on S2 will reset the seconds to zero, and then a short press on S2 will start the clock again from 00.

When an alarm is sounding there is no way to turn it off.  You just have to wait for it to finish.

Co-Curricular Transcript

Students at SUNY New Paltz can participate in a 4-Step training program in electronics soldering, where construction of this clock is the 4th step.  Once the clock is shown to work they can have a certification added to their co-curricular transcript.  The student must request this certification; the instructor cannot award it without a request.

To request certification go to my.newpaltz.edu and click on “Student Engagement” in the main menu.  Then click on Co-Curricular Transcript in the Student Engagement menu.  In the search form enter “solder” in the Keyword field and press “Search.”   Click on the item and fill in the form.

3D Printed Case

Students can have a case for the clock 3D-printed at our Hudson Valley Additive Manufacturing Center.  Payment must be made by credit or debit card after you submit the STL file.   The cost is around $1.30.  Under the “Resources” menu on the HVAMC page, open the “Submit a Build” item and click on “Students”.    The STL file describing the case is YSZ-4_ClockCase.stl, but it will have to be renamed for submission (see the instructions on the submission form).  It was created with OpenSCAD.

More info…

The chip used in this kit is an Atmel AT89C2051 micro-controller, which is capable of much more than just being a clock.  The vendor (or someone) must have flashed the IC with a simple clock program for this kit.  Maybe it would be possible to re-flash it to allow for a 12-hour mode.  Anybody up for this challenge?

Also, I tried powering it with a single 3.2 Volt coin battery, and that worked initially, but drained the battery very quickly, so it’s not really a viable option.

These instructions by clobber24 on Instructables for a C51 4-Bit Clock  apply, except for the power jack.  That page also links to 3D printed cases.  He printed a battery case for three AAA batteries, which he says worked, but he does not report on battery life.

Notes

  1. in 2019.  The cost is slightly higher now.

Welding Ventilation Estimate


I have been investigating the requirements for students to be able to weld on campus, which is needed for our Baja SAE team, for projects for our Engineering Senior Design course, and for various other engineering projects.  One of the requirements is, naturally, adequate ventilation.   Specifically:1

Adequate ventilation providing 20 air changes per hour, such as a suction hood system should be provided to the work area.

We have considered several shop rooms as a possible welding space, but it’s not clear if they already have sufficient ventilation or what it would take to add enough ventilation capacity.   What I realized today is that it is useful to turn the question around and ask:  for a “standard” amount of ventilation, how big a space can be properly ventilated to obtain 20 air changes per hour?

What is a “standard” unit of ventilation?   I have a regular old box fan in my lab, and I was able to measure the speed of the exiting air using a borrowed anemometer.  Fans like this are ubiquitous on a college campus, so I’ll choose that as the standard.   The dimensions are 19″ × 18.5″, for a total area of 2.44 square feet.   I could compute the flow rate (volume/time) by multiplying the area by the speed of the air exiting the fan (in the same linear units!), but this anemometer was smart enough that if I entered the area it automatically gave me the flow rate in cubic feet per minute (CFM).  The flow rate varied with position around the fan, so I took what seemed like a representative average of measurements all over (we could do this better, but I just need a ball-park estimate).   There are three speeds: low, medium, and high.   The results were:

Low: 1750 CFM,   Medium: 2250 CFM,   High: 2650 CFM

Just to use a rough order-of-magnitude estimate I will use 2000 CFM in what follows (mostly).

Next, I need a unit of volume.   One of the rooms that is being considered for welding is room 008 in the basement of Resnick Hall (RH 008).   That room has a roll-up door which happens to be exactly 8 feet wide and 8 feet tall.  I need a unit of volume, not area, so I’ll imagine a cube that goes 8 feet back from that door, for a total of 8′ × 8′ × 8′ = 512 cubic feet.    This is about the size of the smallest PODS storage container, so I will call this a “pod”2 (their container is actually 8′ × 7′ × 7′, but this is close enough for our estimate).

The questions then are 1) how many “pods” can a single box fan ventilate (at 20 air changes per hour), and 2) how many pods does it take to match the volume of the room in question?  If the numbers are wildly mismatched then we know we can stop there.  If they are close, then we can refine our calculations, or just make sure we add an “engineering margin” to be sure we are over the required capacity.

First, how many “pods” can a single box fan ventilate?  Let’s call that unknown N, and compute it by setting the required ventilation rate equal to the measured rate:

(20 × N × 512 ft³) / (60 min) = 2000 CFM

On the left we have the required flow rate for 20 times the volume of N pods (in cubic feet) every 60 minutes.  On the right we have a representative flow rate for a box fan, in cubic feet per minute.  I’ve taken care to use the same units everywhere for time and volume.  Setting these equal and solving for N gives:

N = (2000 × 60) / (20 × 512) = 11.718

which I will round up to 12 pods.  (Using 2250 CFM for the “Medium” setting on the fan would give about 13 pods.)

But I have to take into account that the ceilings in RH 008 are rather high.   They are certainly more than 8 feet, probably more than 12 feet, and maybe even 16 feet.   Since this is only an estimate, I’m happy to perhaps go over a bit and guess 16-foot ceilings, which means we have to imagine two of these pods stacked on top of each other.   Then the corresponding floor area we can ventilate with one box fan ends up being half the number, or 6 pod “footprints” of 8′ by 8′.

If the floor area of RH 008 is about the same as 6 of these 8′ by 8′ pods, then we are okay with just one box fan.   If it’s twice as large, then we can use two box fans.  If it’s as much as four times this, then we could put 4 box fans across the bottom of the sliding door and have enough ventilation.

If we need multiple box fans across the opening then I imagine they might be in a frame, perhaps with wheels to make it easier to move in and out of place.   The box fans are 19″ wide, and with some allowance for the frame that means we could get as many as 4 across the opening.   That would cover 4×6 = 24 pod “footprints”.

And note that the estimated 2000 CFM for one box fan is under the measured “Medium” setting, so the estimate is a bit conservative.   We can easily re-work it with the fan(s) set on “High” if needed, which gives an upper bound on what is possible.  Using 4 box fans set to “High” at 2500 CFM (rounding the measured 2650 down) would give 24 × 2500/2000 = 30 pod footprints.

My purpose here was to make an estimate to see if we could use one or a few box fans to ventilate a particular room, but the method can easily be applied to any other room, because a box fan provides a reasonable standard of ventilation, and a “pod” of 8′ × 8′ × 8′, or with a footprint of 8′ × 8′, is a representative unit of volume which one can easily picture in any room – no tape measure required.  We can use this to quickly rule in or out the possibility of ventilating any candidate space.

  1. See https://www.newpaltz.edu/ehs/safety_welding.html .
  2. Though I want to be clear that I am not offering any product or service which competes with those of the PODS company, so I hope they don’t sue me the way they did U-Haul in 2012.

Wooster Hall Rooftop Mystery


A few weeks ago I visited Wooster Hall with a time-lapse camera to try to see what happens to the light from the skylight over the main staircase at solar noon on the Winter Solstice.  I was a few days early, but even so, I think I uncovered the basic idea, which you can review in a previous blog post.

The result is that the four columns of light that appear at the bottom of the staircase on the equinoxes now appear on the slanted ceiling near the skylight, and don’t extend down any further.   Here’s a picture (click on it for a bigger view):

Wooster Hall skylight on 18 December 2018

But as you will notice, there appears to be something in the way, preventing the columns of light from extending all the way downward, especially on the left.    What could that be?  In the original post of the video I mused that perhaps there is something on the roof which is casting a shadow.   Looking at the roof from a nearby building I could see that there are vents on the roof that are near that skylight.   And after that post I heard from the building architect that those vents are necessary to remove smoke in the event of a fire. It’s doubtful they could be moved.  But from that viewpoint I wasn’t sure that these were actually in line with the skylight, and I’m still doubtful that they are the culprit.

Someone else suggested that I could see what is on the roof using Google Maps.  That turned out to be very helpful.  Here’s the view from directly above, with some added markings (click on the image for a bigger view):

Wooster Hall from above (Google Maps).

The skylight is circled in red, and the green line shows my line of sight from the Chemistry building to the Wooster roof.  As indicated by the compass needle at the right edge, vertical on this photo is North, and as you might expect the four openings in the skylight line up with North rather than with the building.   You can also see the vents near the skylight, the sort-of round things to the right of and below the skylight.   But note that they are NOT directly below (i.e. South of) the skylight.  This means that they cannot be blocking the light in the way seen in the videos, which is what I suspected when viewing them from the Chemistry building.

So what is blocking the light?   I’m going to guess that it’s the roof itself — actually a wall which is a part of the roof.  As you can see from the photo, the roof has several levels (it’s easier to see this from the side view from the other building).   The part of the roof where the skylight is located is higher than the roof farther to the south, and there is a wall dividing the two levels.    You can see this a little better if we zoom in (again, click on the picture to make it (somewhat) larger):

Wooster Hall from above, showing the wall south of the skylight.

The orange line shows the position of the wall, which I suspect is just high enough to block the lower part of the skylight when the sun is at its lowest in the sky, on the Winter Solstice.   If you go back to the picture of the skylight from the inside, it looks like whatever is casting the shadow is larger on the left and sloping down to the right.   But keep in mind that the building is turned away from North, and the skylight image is cast on a slanted ceiling/wall  (which might even be curved).  My guess is that the shadow is actually a horizontal line, caused by the wall on the roof.

And, by the way, if you don’t see that the orange line marks the position of a wall, then go to Google Maps yourself and find this building and select the “satellite” view.   The way Google presents the images they actually change your view slightly as you drag the map, giving a sense of 3D which shows more clearly that the roof has multiple levels.  It’s pretty cool that they can do this without your having to wear 3D glasses.

Is there anything we can do to unblock the sun?   Well, at least it’s not a vent that’s  required for fire safety, but the wall is probably necessary too.   Maybe a section of the wall could be replaced with an open railing  or chain-link fence which would still provide safety to whoever is working up there, but would let the light through to the entire skylight at the Winter Solstice. Or maybe not.

I still want to get up on the roof to try to confirm this conjecture.  By measuring the distance from the bottom of the skylight to the base of the wall, along with the height of the wall, I could determine the position of the shadow of the wall for a given elevation of the sun, and verify that the shadow would reach the skylight.  And maybe figure out how much the wall would have to be lowered (instead of completely removed).   This isn’t over yet, so stay tuned…