Global Entry Interview — Site Development Experience

A few weeks ago, I registered for Global Entry. This is a two-step process: first, you fill out the forms for review by the Department of Homeland Security; second, you appear for an in-person interview with Customs and Border Protection.

Customs took nearly a week to review the forms I completed. Then you have to manually register for the interview. My first available time was June of 2017, OMG! I called them to confirm, and they said, yes, that is the earliest. Your best bet, because people cancel frequently, is to keep checking the website for an opening. Of course, who has time to check a website eight times a day? So I wrote a program to crawl the site hourly. When it finds an opening, it emails me so I can go reschedule my appointment.
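The core of that crawl-and-notify loop can be sketched as below. This is a Python illustration, not the author's actual code (the real app is .NET and drives the site with Selenium, since the scheduling page sits behind a login); the function names, email addresses, and SMTP details are all assumptions.

```python
import smtplib
from datetime import datetime
from email.message import EmailMessage

def earlier_slot(open_slots, current_appt):
    """Return the earliest open slot before the current appointment, or None."""
    earlier = [s for s in open_slots if s < current_appt]
    return min(earlier) if earlier else None

def notify(slot, to_addr="me@example.com"):
    """Email a reminder to go reschedule (server and addresses are placeholders)."""
    msg = EmailMessage()
    msg["Subject"] = f"Global Entry opening: {slot:%Y-%m-%d %H:%M}"
    msg["To"] = to_addr
    msg.set_content("An earlier interview slot opened up - go reschedule!")
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

# The hourly pass: scrape the list of open slots (Selenium in the real app),
# then compare them against the currently booked appointment.
current = datetime(2017, 6, 15, 10, 0)
slots = [datetime(2017, 6, 20, 9, 0), datetime(2016, 11, 3, 14, 30)]
found = earlier_slot(slots, current)
# If found is not None, notify(found) would fire here.
```

An outer scheduler (cron, a Windows service, a timer loop) would run this pass once an hour.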

Because this was so successful, some friends said, “You have to publish that!” I’ve now created a new website at www.globalentryinterview.com.

My original vision was to host a service that would manage the process without any human intervention. This is the story the website tells today. However, that model requires people to share their username and password for the GOES website. I’ve since learned that people are very concerned about the security risk this creates. The data stored on the GOES website is highly confidential, and that concern is hard for me to overcome.

So, I migrated the application to run as a Windows desktop application. Fortunately, I created the original version in a somewhat structured pattern and was able to modify the code to run dual-mode. Using a switch, it morphs between server-based invisible processing and a Windows desktop application. It’s very cool.
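A dual-mode switch like that can be as simple as one flag selecting which front end wraps the shared polling logic. Here is a minimal Python sketch (the real app is .NET; the flag name and mode functions are assumptions for illustration):

```python
import argparse

def run_headless():
    """Server mode: run the polling logic invisibly, with no UI."""
    return "headless"

def run_desktop():
    """Desktop mode: the same polling logic behind a windowed UI."""
    return "desktop"

def main(argv):
    # One code base, two front ends: a single command-line switch
    # decides whether the app runs as an invisible server process
    # or as an interactive desktop application.
    parser = argparse.ArgumentParser()
    parser.add_argument("--headless", action="store_true",
                        help="run as an invisible server-side process")
    args = parser.parse_args(argv)
    return run_headless() if args.headless else run_desktop()

mode = main(["--headless"])
```

Keeping everything except the thin entry points shared is what makes the migration from a server service to a desktop app cheap.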

It took about four weeks to complete the entire application. The biggest thing I learned through this process is how easy it has become to develop very sophisticated applications. Here are the components I used:

  • I run my own instances of WordPress for the website. They are hosted on BlueHost.com, which I’ve used for about five years now.
  • The actual processing service is a .NET application running on Windows, hosted on Microsoft Azure.
  • To crawl the site, I use Selenium to automate the browser, either on the server or on a client computer.