Friday, March 1, 2013

Sending an email with JavaScript and ASP.NET


I recently ran into an issue where I wanted an HTML form to send an email from a website.  I was building the site entirely in HTML, JavaScript, and CSS.  However, it is not possible to send an email from the client side; it must be handled on the server, and the two most common options for that are ASP.NET and PHP.  While searching for a solution I found that most people simply suggest building the entire site in ASP.NET or PHP, but I was determined not to do this.  My goal was to write only what I needed in ASP.NET and leave the rest alone.  I chose C# because it is the language I am most experienced with.

After quite a bit of searching, I found the basis of what I needed to get started.  The jQuery JavaScript libraries have an Ajax function that can call an ASP.NET method on the server.  The jQuery.ajax() function has many possible uses but we will just use it to call an ASP.NET function to send an email here.

Let’s start by building the HTML form for the user to fill out and submit.  In this example we have name, email, and message inputs along with a submit button.  We leave the action blank and use a class called “validate-form” to tie the form to the CSS.  We will not discuss validation here.

<form action="#" class="validate-form">
  <fieldset>
    <input type="text" id="Name" value="Name"/>
    <input type="text" id="Email" value="Email"/>
    <textarea cols="10" rows="5" id="Comments">Comments</textarea>
    <input type="submit" id="Submit-footer" value="Send the Message"/>
  </fieldset>
</form>

Now let’s create the ASP.NET portion of the project.  Create a new ASPX file and name it “SendEmail.aspx”.  First, add a using directive for System.Net.Mail at the top of the code-behind file:

using System.Net.Mail;

Now create a new static method called “SendMessage”.  The method takes parameters for the name, email, and message.  The code below is a fairly standard way of sending an email, so I will not go into detail.  The key values to change are the SMTP server name and the destination email address.

[System.Web.Services.WebMethod]
public static void SendMessage(string name, string fromEmail, string comments)
{
    const string SERVER = "relay-hosting.secureserver.net";
    const string TOEMAIL = "info@sample.com";
    MailAddress from = new MailAddress(fromEmail);
    MailAddress to = new MailAddress(TOEMAIL);
    MailMessage message = new MailMessage(from, to);

    message.Subject = "Web Site Contact Inquiry from " + name;
    message.Body = "Message from: " + name + " at " +
                   fromEmail + "\n\n" + comments;
    message.IsBodyHtml = false;
    SmtpClient client = new SmtpClient(SERVER);
    client.Send(message);
}

Next, create a JavaScript file called “functions.js”.  We will create a function that kicks off when the user clicks the submit button.  There are several ways to do this, so feel free to use whichever works best for you.  First we get the three inputs from the user.  Next we call the jQuery.ajax() function.  The url parameter is the path to the .aspx file and its method.  The data parameter feeds the input to the ASP.NET method we just created.  The format is {‘parameter1’: ‘userinput’, …}, where “parameter1” is the name of the parameter in the ASP.NET method.  In this example, we build the data string first.  The remaining parameters for jQuery.ajax(), along with many others, are described in the documentation for the function.

$(function () {
    $('.validate-form').submit(function () {
        var name = document.getElementById("Name").value;
        var fromEmail = document.getElementById("Email").value;
        var comments = document.getElementById("Comments").value;
        var data = "{'name': '" + name + "', 'fromEmail': '" +
                   fromEmail + "', 'comments': '" + comments + "'}";

        $.ajax({
            type: "POST",
            url: "SendEmail.aspx/SendMessage",
            data: data,
            contentType: "application/json; charset=utf-8",
            dataType: "json"
        });
        return false; // keep the form from reloading the page
    });
});
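One caveat about the hand-built data string: it breaks as soon as the user types a quote character.  A serializer produces the same payload safely.  Here is the idea sketched in Python for illustration (json.dumps plays the role JSON.stringify would play in the JavaScript above; the input values are hypothetical):

```python
import json

# Hypothetical form input; the apostrophe would break the hand-built string.
name = "O'Brien"
from_email = "obrien@example.com"
comments = "I'd like more information."

# A serializer quotes and escapes everything correctly.
data = json.dumps({"name": name, "fromEmail": from_email, "comments": comments})
print(data)
```

The same payload is then parsed cleanly on the server side, quotes and all.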

Finally we need to add some items to the header of the HTML document.  We need to add a reference to the stylesheet, which we will not discuss here.  Next we need to add references to two JavaScript files.  First, download the jQuery library from the jQuery website.

<link rel="stylesheet" href="css/style.css" type="text/css" 
      media="all" />
<script src="js/jquery-1.4.2.js" type="text/javascript" 
        charset="utf-8"></script>
<script src="js/functions.js" type="text/javascript" 
        charset="utf-8"></script>

That’s it.  Now when the user clicks the submit button, an email will be generated on the server and sent to the specified address.  As with all code, there are many ways to do different parts of this example.  Use whichever works best for you.

Monday, February 4, 2013

Creating a Custom Pop Up with a Spatial Query


The next hurdle I ran into while creating the GeoLinx application was creating a pop up that would display the attributes for more than one feature in the same spatial location.  Each job post is geocoded to a city because a full address is rarely included in the post.  So each post is typically located at the centroid of the city (or sometimes a specific place like a military base), and as a result the points stack on top of each other.  The basic pop up implemented with the Google Maps API will only get the information for the point on top.  To overcome this I had to write my own code to select all of the points within a radius of the user’s click.



First I created a Fusion Tables layer to display the points on the map, simply by following the documentation.  I did suppress the default Info Window; otherwise two pop ups would appear when the user clicks on a point.  I also included a SQL query to hide posts that are older than a specified date or that have expired.  You can find many good examples of JavaScript functions that will get the current date or a past date.  Finally, I created a new Info Window to display the attributes for the selected postings.  An Info Window is simply the normal pop up you see in most Google Maps applications.

var ftLayer = new google.maps.FusionTablesLayer({
    suppressInfoWindows: true,
    query: {
        select: 'GeoCode',
        from: table,
        where: "PostDate > '" + currentDate +
               "' AND (ExpireDate > '" + todaysDate +
               "' OR ExpireDate = '')"
    }
});
ftLayer.setMap(map);

var infoWindow = new google.maps.InfoWindow();

The next step was to create a listener for the click event of the Fusion Tables layer.  This simply tells the application to “listen” for the user to click on a point in the layer and then do something.  Once again I followed the documentation for this.  I then built a query to select the attributes I wanted to display, using a spatial query to select the features within a certain distance of the user’s click.  Next it is essential to convert the query to a URL format.  Then I completed the query string by including the rest of the URL path and my key.  Finally, I created a Google Visualization query using the Google Visualization API.

google.maps.event.addListener(ftLayer, 'click', function (event) {
    var query = "SELECT Title, GeoCode, Organization, " +
                "PostDate, URL FROM " +
                table +
                " WHERE ST_INTERSECTS(GeoCode, CIRCLE(LATLNG" +
                event.latLng + ", 5000)) ORDER BY PostDate DESC";
    query = encodeURIComponent(query);
    query = 'http://www.google.com/fusiontables/gvizdata?tq=' +
            query + '&key=' + key;

    var gvizQuery = new google.visualization.Query(query);

Once the query was created, I sent it to the server to get back the result.  I also had to create some HTML formatting to display the results in a tabular format; the Info Window will take straight HTML.  I cut out some of the HTML formatting below to reduce the length of this post, but you can research HTML tables if you need to.  Using the Google Visualization API, I looped through each row of the returned query and built each row of the table.  I also had to do some formatting on the dates.  The final steps were to set the position of the pop up to the user’s click, feed the HTML for the table to it, and open the Info Window.

    gvizQuery.send(function (response) {
        var content = ...//HTML here

        var numRows = response.getDataTable().getNumberOfRows();

        for (var i = 0; i < numRows; i++) {
            var title = response.getDataTable().getValue(i, 0);
            var loc = response.getDataTable().getValue(i, 1);
            var org = response.getDataTable().getValue(i, 2);
            var pDate = response.getDataTable().getValue(i, 3);
            var url = response.getDataTable().getValue(i, 4);

            pDate = pDate.toString();
            pDate = pDate.replace(
                '00:00:00 GMT-0500 (Eastern Standard Time)',
                '');
            pDate = pDate.substr(3, pDate.length);

            content = content
                + title + ...//HTML here
                + loc + ...//HTML here
                + org + ...//HTML here
                + pDate + ...//HTML here
        }

        content = content + ...//HTML here

        infoWindow.setPosition(event.latLng);
        infoWindow.setContent(content);
        infoWindow.open(map);
    });
});
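The encode-then-concatenate step in the listener above can be illustrated outside the browser.  This Python sketch uses urllib.parse.quote in place of JavaScript's encodeURIComponent (the two differ on a handful of characters, and the table ID and key here are placeholders):

```python
from urllib.parse import quote

key = "YOUR_API_KEY"  # placeholder -- use your own key
query = ("SELECT Title FROM tableId WHERE ST_INTERSECTS(GeoCode, "
         "CIRCLE(LATLNG(40.44, -80.0), 5000))")

# Percent-encode the whole query, then bolt on the gviz endpoint and key.
url = ("http://www.google.com/fusiontables/gvizdata?tq=" +
       quote(query, safe="") + "&key=" + key)
print(url)
```

The spaces, parentheses, and commas in the SQL all get percent-encoded, which is exactly why the encoding step is essential before the query goes into a URL.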

Overall, I had a difficult time putting this together because it uses code from three Google APIs, and I couldn’t find examples doing exactly this.  But I did learn a lot that I would use in other parts of the application.  Just a note: I’m sure there are other and better ways to do this.  Feel free to comment if you have better solutions.  The next part of the project was to add search functionality that would center and zoom the map to the address or zip code that the user specifies.

Friday, January 25, 2013

Using Fusion Tables for GeoLinx

Building the GeoLinx website was my first experience with the Google Maps API. While I have plenty of experience with HTML, I have only used JavaScript on a limited basis (most of my experience has been with C# and Python), so this project has been a learning experience for me. I used a lot of the sample code provided by Google and other users. However, I have run into many instances where I could not find samples to meet my needs and had to create a solution on my own. I will continue to post those solutions as I come across them so anyone else with the same problem will have a reference.

Getting a basic map displayed was easy by simply following along with the tutorial provided by Google. The first issue I encountered was where to store the data and how I would display it. In this case the data was the location of job postings. I didn’t want to use a spatial database like SQL Server or PostgreSQL at this point because of the time needed to set one up, as well as the problem of where to host it. So I looked at KML first. While it would meet my needs, maintaining a KML file with frequent updates would take quite a bit of work.



Then I discovered Google Fusion Tables, which seemed like the perfect solution for the time being. A Fusion Table is basically a database table stored on Google Drive that can be manipulated like any other document. In addition, Google provides an API that allows you to access and modify these tables with SQL like any other database table. But the most interesting part about Fusion Tables is that you can add a spatial field, just as in any other spatial database, and this field can store not only latitude and longitude coordinates but also a string to be geocoded. For example, after entering “Pittsburgh, PA”, Fusion Tables will automatically geocode this string and convert it to the proper coordinates. This uses the same geocoding service that Google Maps uses, so you can check where the point will be placed by simply typing the string into Google Maps.

The downside of Fusion Tables is that it is still considered experimental, so it lacks many of the useful features of most databases. One missing feature is the ability to add an auto-populating unique ID field, and many useful SQL functions are missing as well. Fusion Tables does allow basic spatial SQL queries. These functions are not nearly as robust as those of most spatial databases, but they have met my needs so far.

To get started I created a simple table that keeps track of the position title, the location (usually city and state) for geocoding, the city, state, country, organization, date posted, expiration date (if applicable), the class (technician, analyst, developer, etc.), and the URL of the posting. Once this table was set up with some data, I simply followed the API reference to display these points on the map. I added a SQL query to only display jobs that have not expired and were posted within the last 60 days, so the postings stay relevant.
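The freshness filter described above is just string assembly around two dates. Here is the idea sketched in Python (the column names match this post's table, but the MM/DD/YYYY date format is an assumption, so use whatever your table actually stores):

```python
from datetime import date, timedelta

def build_where(today, max_age_days=60):
    # Column names (PostDate, ExpireDate) match the table described above;
    # the MM/DD/YYYY format is an assumption -- match your own table.
    cutoff = (today - timedelta(days=max_age_days)).strftime("%m/%d/%Y")
    now = today.strftime("%m/%d/%Y")
    return ("PostDate > '" + cutoff + "' AND (ExpireDate > '" + now +
            "' OR ExpireDate = '')")

print(build_where(date(2013, 1, 25)))
```

The empty-string check keeps postings that have no expiration date from being filtered out along with the expired ones.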



The next step was creating a custom popup window that would display the attributes for all of the points in the same location. Check back soon!

Thursday, January 24, 2013

An Introduction to GeoLinx


The GeoLinx project started out as a simple concept - bringing geospatial industry professionals together through the sharing of information.  The idea started in early 2008 when we realized that searching for jobs in the GIS industry was difficult because there were only a few sources dedicated to this specific discipline of career postings.  At that time we started "The Western Pennsylvania GIS Newsletter", focusing on GIS jobs in the western Pennsylvania area (where we were located at the time).  The newsletter gained quite a following during its run.  Unfortunately, due to time constraints the newsletter fell by the wayside.

During this time, the GeoLinx project was envisioned.  We wanted to create a one-stop site where GIS professionals could find information on professional organizations, educational institutions and programs, GIS firms, career opportunities, and GIS events.  Obviously this would be a large undertaking that would take far more resources than we had at the time.


Development Faction, LLC was formed in March of 2009.  Over the past several years, we have been growing in size and experience.  In the fall of 2012 we decided to revisit GeoLinx as a side project, allocating time to develop the site when possible.  We decided that a phased approach was the best method of getting GeoLinx off the ground, and the obvious target seemed to be GIS job postings.  We decided this would be the perfect excuse to play with the Google Maps API.

So that is how the GeoLinx website got its start.  The first version allows users to search for job postings within a specified radius.  The user can select a job and is redirected to the original job post.  A GIS approach to searching for GIS jobs seemed like the perfect fit.  We currently update the site daily (usually), and job postings can be located anywhere in the world, though they are mostly centered in North America, since that is where we are located and receive the most postings.

Since we have launched the first version of the site, we have been pleased with the positive feedback we have been getting, as well as the large volume of traffic to the site.  We plan to continue to improve the site and already have a long list of enhancements and new features planned.  We hope to get more users sending us posts so we don’t have to search for them ourselves.  We are also posting jobs on Twitter daily.  Finally, we will be following up with regular posts on the project and the development of the site as we go, so check back regularly.

Wednesday, August 29, 2012

Unable to reach ArcGIS Server 10.1 Configuration Manager


For web GIS application installs behind a firewall (intranet), ArcGIS Server communicates over HTTP on port 6080 by default (see Ports used by ArcGIS Server).  Typically you can navigate to your server's localhost or loopback IP address on port 6080 to reach the ArcGIS Server Configuration Manager, using URLs similar to those shown below.
http://localhost:6080/arcgis/manager
http://127.0.0.1:6080/arcgis/manager

But what if the page errors out and you cannot reach the ArcGIS Server Configuration Manager? 
There could be many things causing this problem which are relatively simple to correct, but not always simple to diagnose.  Here are a few recommendations to help diagnose and correct this problem:

1.      The first and most obvious item to check is your Windows Network Diagnostics.  Troubleshooting with Windows Network Diagnostics might help reveal the underlying issue.  You might get an error message like "The remote device or resource won't accept the connection" or "The device or resource (127.0.0.1) is not set up to accept connections on port 6080".


This type of error is indicative of an implicitly blocked port or similar issue.  The fix could be as simple as adding an exception to your firewall for port 6080 (TCP).  If that fixes it, great!  If not, read on…
2.       A second item to check would be the "Hosts" file.  In very rare situations this could be the culprit.  Perhaps some network configuration changes were made, or you encountered network problems elsewhere, since initially installing ArcGIS Server.  The "Hosts" file could be pointing to an improper IP address instead of the desired one.  If this is the case, simply alter the IP address in the "Hosts" file to point back to your localhost or loopback address and it should be repaired.
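Independent of the browser, a quick way to confirm whether anything is listening on port 6080 at all is a raw socket test.  A small Python sketch (run it on the server itself, or point host at the server's address):

```python
import socket

def port_open(host, port, timeout=3):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# On a healthy install this should print True on the server itself.
print(port_open("127.0.0.1", 6080))
```

If this returns False while ArcGIS Server is running, a firewall rule or Hosts entry is the likely suspect; if it returns True, the problem is further up the stack.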
Of course there could be many other reasons you are having problems, but since the documentation on this subject is limited, I figured it would be worth sharing some of the solutions we have used.  Hopefully you find this helpful.

Friday, November 12, 2010

Python Tip: Coordinate Clean Up

If you create point features from coordinates stored in flat tables, you have probably found cases where the longitude and latitude are reversed or a negative sign is missing.  This happens frequently when the source of the data is a non-GIS user.  What can make fixing these mistakes more difficult is when the errors are not consistent throughout the table.  The following Python example will switch the longitude and latitude if necessary and add negative signs if missing.  Note: This code is only useful if all the points are within the same hemisphere.  If the points are located across the globe, a solution to this problem is much more difficult.

First, import the modules and create the geoprocessor:

import sys, string, os, arcgisscripting

gp = arcgisscripting.create()
gp.OverWriteOutput = 1

Next, create an Update Cursor:

rows = gp.UpdateCursor(r"C:\Project\sample.gdb\table")
row = rows.Next()

Now you need to loop through each row in the table and get the longitude and latitude values.  Note that XCOORDINATE and YCOORDINATE are the names of the fields in the table:

while row:
     xCoord = row.XCOORDINATE
     yCoord = row.YCOORDINATE

We will need to switch the longitude and latitude if they are reversed.  We do this by checking whether the longitude falls in a range that is only valid for latitude.  In this example all points fall between 30 and 50 degrees North latitude, and all valid longitudes are negative, so a longitude value between 30 and 50 means the two values are switched:

     if xCoord > 30 and xCoord < 50:
          row.XCOORDINATE = yCoord
          row.YCOORDINATE = xCoord
          xCoord = yCoord

We use simpler logic to check for a missing negative sign.  In this case the longitude should always be negative, so if it is not, we flip the sign:

     if xCoord > 50:
          xCoord = xCoord - (xCoord * 2)
          row.XCOORDINATE = xCoord

Now that any necessary changes have been made, we update the row and move on to the next one:

     rows.UpdateRow(row)
     row = rows.Next()

Finally, delete the cursor:

del row
del rows

As you can see the values will change depending on the location of your points, but this general logic should solve most of your problems.  And like all examples, there are several ways to do most of these steps.
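Stripped of the geoprocessor, the correction logic boils down to this (a sketch over plain coordinate pairs; the 30-to-50 window is this post's example range, so substitute your own study area):

```python
def fix_coord(x, y):
    """Swap lon/lat if reversed, then restore a missing negative sign.

    Assumes latitudes between 30 and 50 degrees North and longitudes
    west of 50 degrees (i.e. valid x values are more negative than -50).
    """
    if 30 < x < 50:       # x holds a latitude value, so the pair is swapped
        x, y = y, x
    if x > 50:            # longitude lost its negative sign
        x = -x            # same result as x - (x * 2) in the script above
    return x, y

print(fix_coord(40.5, -79.9))   # reversed pair
print(fix_coord(79.9, 40.5))    # missing negative sign
```

Because the swap runs before the sign check, a pair that is both reversed and missing its negative sign still comes out correct.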

Friday, September 17, 2010

Migrating Data from Trimble GPS Pathfinder Office to an ESRI Geodatabase

If you use Trimble GPS Pathfinder Office, you have probably discovered that it does not export data directly into an ESRI geodatabase.  This can be a significant problem, especially if you have a large data dictionary with many features.  However, GPS Pathfinder Office does allow you to export each feature to an individual shapefile.  Using Python, we can quickly move all of the data in the shapefiles into a geodatabase.  The following script will do this automatically for you.  Some important notes, however, are:

1.) This script was written quickly to get the job done.  There are many ways this can be done, and this is only one of them.
2.) The data model of the geodatabase must match the data dictionary exactly.  Strictly speaking, the fields do not all need the same names, but they must be in the same order; you will see below why this is important.
3.) The feature classes in the geodatabase must match the names of the features in the data dictionary.  For example, if a feature is called "Roads" in the data dictionary, the feature class should be called "Roads" as well.
4.) All of the feature classes must be in the same data set.
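Requirement 3 is easy to verify before running the migration.  A quick sketch that compares the exported shapefile names against a list of feature class names (all names here are hypothetical):

```python
import os

def unmatched_shapefiles(filenames, feature_classes):
    """Return shapefile base names with no matching feature class."""
    fc = set(feature_classes)
    missing = []
    for name in filenames:
        base, ext = os.path.splitext(name)
        # Only .shp files count; sidecar files like .dbf and .shx are ignored.
        if ext.lower() == ".shp" and base not in fc:
            missing.append(base)
    return missing

print(unmatched_shapefiles(["Roads.shp", "Hydrants.shp", "Roads.dbf"],
                           ["Roads", "Signs"]))
```

Anything this reports would simply be skipped by the script below, so it is worth checking up front.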

Now let's get started.  First import your modules, create a geoprocessing object, and load the Data Management Toolbox:

import sys, string, os, arcgisscripting

gp = arcgisscripting.create(9.3)

gp.AddToolbox("C:/Program Files/ArcGIS/ArcToolbox/Toolboxes/Data Management Tools.tbx")

Next get a list of all the shapefiles in the output directory from GPS Pathfinder Office:

fileList = os.listdir("C:\\Project\\Data_Dictionary\\SHP")

Now we need to iterate through the list of files in the directory.  We first need to determine if the file is a shapefile by getting its extension:

for i in fileList:
    splitText = os.path.splitext(i)

Now we need to check whether the file is a shapefile, create an empty string for building the Append field-mapping parameter, and start a counter that will keep track of the position of the fields:

    if splitText[1].lower() == ".shp":
        fieldMap = ""
        count = -1

Now we need to get a list of the fields in the matching feature class.  If the shapefile does not match a feature class, it will be skipped:

        try:
            fields = gp.ListFields("C:\\test.mdb\\Data\\" +\
                splitText[0])
        except:
            continue

Let's loop through the fields in the feature class and match them up to the shapefile.  We also need to increase the counter:

        for field in fields:
            count += 1

It is actually not possible to make the data model exactly the same because the feature class requires an OBJECTID and Shape field.  To overcome this we will skip these fields in the field mapping:

            if field.Name == "Shape" or \
               field.Name == "OBJECTID":
                continue

We must now get the information from the shapefile:

            else:
                gp.MakeFeatureLayer("C:\\SHP\\" + i,
                    "tempLayer")
                desc = gp.Describe("tempLayer")
                fieldInfo = desc.FieldInfo
               
In the parameter for the Append tool, each field mapping must be separated by a semicolon, so we prepend one before every field except the first:

                if count > 2:
                    fieldMap += ";"

Now we build our field mapping string from all the information we just collected.  Here you will see we use the counter as the index number of the field:

                fieldMap += field.Name + " '" + field.Name +\
                   "' true true false " +\
                   str(field.Length) + " " + field.Type +\
                   " 0 0 ,First,#,C:\\SHP\\" + i + "," +\
                   fieldInfo.GetFieldName(count) + ",-1,-1"

We can now finish the script by using the Append tool to append the records from the shapefile to the feature class:

        gp.Append_management("C:\\SHP\\" + i,
           "C:\\test.mdb\\Data\\" + splitText[0], "NO_TEST",
           fieldMap, "")
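The field-map string is the fiddliest part, so here it is isolated from the geoprocessor (a sketch with hypothetical fields; the literal "true true false" flags and ",-1,-1" suffix are simply the tokens the Append tool expects, as in the script above):

```python
def build_field_map(fields, shapefile):
    """fields: (name, length, type, source_name) tuples, with the
    OBJECTID and Shape fields already removed."""
    parts = []
    for name, length, ftype, src in fields:
        # One mapping entry per field, in the format used by the script above.
        parts.append(name + " '" + name + "' true true false " +
                     str(length) + " " + ftype +
                     " 0 0 ,First,#," + shapefile + "," + src + ",-1,-1")
    return ";".join(parts)

fm = build_field_map([("RoadName", 50, "String", "RoadName"),
                      ("Lanes", 4, "Integer", "Lanes")],
                     "C:\\SHP\\Roads.shp")
print(fm)
```

Joining the entries at the end also sidesteps the count-based semicolon bookkeeping the original script has to do.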

As you can tell, this is not necessarily the most efficient way to do this.  I have since rewritten this process in C# to create a more stable tool; however, this fairly simple script can still save you quite a bit of time.  I encourage you to modify it and make it work even better.