nodegoat Tutorials

The tutorials about the virtual research environment (VRE) nodegoat are intended for users of Nodegoat Go, which is available to students and researchers at the University of Bern, but they also work with other nodegoat installations. Before you start, take a quick look at the following ultra-short version on DATA MODELING in nodegoat. Basically, you build your model in nodegoat with:

1) Objects

2) Sub-Objects where you can store Dates and Locations (for visualisation).

3) Categories

What is the difference between Objects and Categories? Objects are classified with Categories. Example: an Object can be a person, and a Category the person’s profession. That easy? Yes. Objects and Sub-Objects can also have descriptions (text, images, links, relations). Sounds a bit like Excel to you? Yes. Objects are similar to rows in Excel, descriptions similar to columns. And Sub-Objects? They are special to nodegoat: think of a dedicated place in Excel where the Locations and Dates belonging to a row are stored. So if you have data in Excel, it should be quite easy to import it into nodegoat? Yes, if your data has a clean and consistent structure.
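If it helps to see this structure at a glance, here is a tiny sketch of the idea in Python (illustrative only, with invented example values; this is not an actual nodegoat data format):

# One Object of the Object Type 'Person', described by fields and classified by a Category,
# with a Sub-Object that carries the date and location used for visualisation.
person = {
    "object_type": "Person",
    "object_descriptions": {
        "Name": "Anna Example",      # a description (text)
        "Profession": "Printer",     # value taken from the Category 'Profession'
    },
    "sub_objects": [
        {"event": "Birth", "date": "1-8-2020", "location": "Bern"},
    ],
}
print(person["object_descriptions"]["Profession"])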

Attention: the Objects and Categories are named differently in the DATA area than in the MODEL area: DATA: Objects – Categories / MODEL: Object Types – Classifications

For those in a hurry: skip tutorials no. 1-9 and start right away with tutorial no. 10. There a short video (no sound, no commentary) shows from scratch how to create a simple project with a data model and how to import and visualize data. For the tutorial you need a nodegoat account and the test data set I provide on this website. If you don’t have an account yet, ask a friend where to get one, or your institution (university) if it provides nodegoat as a digital tool. If you are studying or working at the Faculty of Philosophy and History at the University of Bern, you can get a nodegoat account here: https://forms.gle/Gjm4682EJLsq5TCR7. Or get a student account directly at nodegoat: https://nodegoat.net

 

1. Getting started: create your first Project in ‘Management’ / add Object Types in ‘Model’ / add Object descriptions in ‘Model’ / activate the Object Types in ‘Management’ / work with the Object Types in ‘Data’

If you have a new nodegoat account, just follow the on-screen instructions after logging in, as in the following video, to create your first project with one Object that has two Object descriptions (Name of Object and Comment):

https://drive.google.com/file/d/1BHjuWx9bMO5wGbXFzuYJMKkEdWSkvrTS/view?usp=sharing

 

2. Your first visualization: Locations must be stored in the Sub-Object of an Object. Create a Sub-Object ‘Location’ for your Object in ‘Model’

In this video we will add a field to store locations in our “First Object” that we created in video 1. Locations are not stored in the object description (like the name of the Object), but in the Sub-Object, as you will see in the video. You’ll also see how to change a Location:

https://drive.google.com/file/d/1tH73vB6Pbyt39mzfZok_7mfCw-MTqKNo/view?usp=sharing

 

3. Entering dates: Dates must be stored in the Sub-Object of an Object

In this video we store a date for our “first Object”. Then we go to “Model” and select “Period” in the Sub-Object so that we can store two dates (start date and end date) in our “first Object”. This means that for each Object in nodegoat you can choose a point in time or a period of time. In addition, you can also save vague dates, as we will show later in video 4. Attention: dates are entered as day-month-year with hyphens, for example 1-8-2020 (not 1.8.2020).

https://drive.google.com/file/d/1PSvpt1N6GN9EzjEtBBjapURzQDB9cUhP/view?usp=sharing
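A note on the date format just mentioned: if your source data comes with dots instead of hyphens, you can convert the dates before entering or importing them, for example with a few lines of Python (a minimal sketch, not part of the video):

# Turn a date like '1.8.2020' into the hyphenated form '1-8-2020' used in nodegoat.
def to_nodegoat_date(value):
    day, month, year = value.split(".")
    return "%d-%d-%d" % (int(day), int(month), int(year))

print(to_nodegoat_date("1.8.2020"))  # prints: 1-8-2020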

 

4. Entering vague dates: Vague dates must be stored in the Sub-Object as a Chronology

In this video a simple example shows how to work with vague dates in nodegoat. There are many more ways to capture vague dates in nodegoat, see: https://nodegoat.net/guides (working with temporal data). In the example we do not know the exact start date, but we estimate that it was 5 days after the start date we entered earlier (1-8-2020). So we make a statement: ‘5 days after begin start date’. Such vague dates are entered in the Sub-Object, not as ‘Point’ but as ‘Chronology’, as shown in the video.

https://drive.google.com/file/d/19CgfcGg5ysjgGFyYZsfhzqQoqOQuDC0w/view?usp=sharing

 

5. Importing Locations with CSV data: Geo coordinates must be stored in Sub-Objects

In this video you see how to create an Object Type ‘Locations’ with Object descriptions that match the column names in the CSV file. After creating the Object Type, import the sample CSV data into your Object ‘Locations’. The sample data provides about 40k Locations with geonames.org IDs and geo coordinates: Location1.csv

https://drive.google.com/file/d/1JNKq78e9J6m8d_mtQtx7J-ByV4PiOfSv/view?usp=sharing
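Before mapping the file in the import template, it can help to check which column names the CSV actually contains, so that your Object descriptions match them exactly. A small sketch in Python (the column names shown in the comment are only a guess at what Location1.csv might contain):

import csv

# Print the header row of the sample file so it can be compared
# with the Object descriptions of the 'Locations' Object Type.
with open("Location1.csv", newline="", encoding="utf-8") as f:
    header = next(csv.reader(f))
print(header)  # e.g. ['Name', 'geonames_id', 'Latitude', 'Longitude'] (depends on the file)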

Hint: about 130k locations from geonames.org come preinstalled in nodegoat (Type: City). These locations can be used and extended by all users as a collaborative resource.

 

6. Create your first relation: Relations are created in the Model to be used in Data

In this video we establish a relationship between the ‘first Object’ and Locations, because we want to use the Locations as a georeference for the ‘first Object’. In Model we select ‘Locations’ as the georeference in the Sub-Object of the ‘first Object’. In Data we enter a Location and see that it is not visualised immediately, because we first have to set the Visual Settings correctly (selecting Location as the reference for the visualisation). We then change the Location in the ‘first Object’ and see that we have to activate the ‘Quicksearch’ option in Model (in the Object descriptions of Location) so that we can search for a new Location in the Quicksearch field.

https://drive.google.com/file/d/1z4-9wrIn-5-hryOZrbabnBLqdTeMkH5l/view?usp=sharing

 

7. Create your first Classification (Category): Classifications are created in the Model to be used in Data.

In this video we create a Classification ‘Attribute’ in the Model. Then we go to the ‘first Object’ in the Model and add an Object description ‘Attribute’ that is linked to the newly created Classification.

https://drive.google.com/file/d/1RAbHPi0p09-NFoL7JnCO_zGklpuzCvGS/view?usp=sharing

 

8. Expand your Data Model: Person with ‘Event Birth’.

In this video we change the ‘first Object’ to ‘Person’, add a Classification ‘Event (kind of)’ in Model and link it to the Sub-Object description of Person. Then we add the Event ‘Birth’ to the Sub-Object. This gives us a simple model for biographies that we can extend with other events (such as death, activities, etc.). In the Sub-Object we can add a date and a location to each of these events.

https://drive.google.com/file/d/1NsZknx5vO-YZTPDolW-2ISletlyniApN/view?usp=sharing

 

9. Change the background map

In nodegoat you can integrate any background map you like, as long as the map is available on a tile server via a link. So all you need is this link to the map. But where can you find such links? Google is your friend, or this short tutorial. In nodegoat, go to the Visualisation Settings.

Then go to the Visual Settings tab. You will automatically get to the Geographical Settings.

The standard map is the Google Map with its copyright notice. Remove the Google Map link from the Map field and insert the link to your new map. Change the copyright notice as desired.

Save your Map settings.

I have compiled some links here. Don’t be put off by the length of the links; they are just like this, sometimes longer, sometimes shorter:

Google Map

//mt{s}.googleapis.com/vt?pb=!1m5!1m4!1i{z}!2i{x}!3i{y}!4i256!2m3!1e0!2sm!3i336008092!3m14!2sen-US!3sUS!5e18!12m1!1e47!12m3!1e37!2m1!1ssmartmaps!12m4!1e26!2m2!1sstyles!2zcy5lOmd8cC5jOiNmZmY1ZjVmNSxzLmU6bHxwLnY6b2ZmLHMuZTpsLml8cC52Om9mZixzLmU6bC50LmZ8cC5jOiNmZjYxNjE2MSxzLmU6bC50LnN8cC5jOiNmZmY1ZjVmNSxzLnQ6MXxzLmU6Z3xwLnY6b2ZmLHMudDoyMXxzLmU6bC50LmZ8cC5jOiNmZmJkYmRiZCxzLnQ6MjB8cC52Om9mZixzLnQ6MnxwLnY6b2ZmLHMudDoyfHMuZTpnfHAuYzojZmZlZWVlZWUscy50OjJ8cy5lOmwudC5mfHAuYzojZmY3NTc1NzUscy50OjQwfHMuZTpnfHAuYzojZmZlNWU1ZTUscy50OjQwfHMuZTpsLnQuZnxwLmM6I2ZmOWU5ZTllLHMudDozfHAudjpvZmYscy50OjN8cy5lOmd8cC5jOiNmZmZmZmZmZixzLnQ6M3xzLmU6bC5pfHAudjpvZmYscy50OjUwfHMuZTpsLnQuZnxwLmM6I2ZmNzU3NTc1LHMudDo0OXxzLmU6Z3xwLmM6I2ZmZGFkYWRhLHMudDo0OXxzLmU6bC50LmZ8cC5jOiNmZjYxNjE2MSxzLnQ6NTF8cy5lOmwudC5mfHAuYzojZmY5ZTllOWUscy50OjR8cC52Om9mZixzLnQ6NjV8cy5lOmd8cC5jOiNmZmU1ZTVlNSxzLnQ6NjZ8cy5lOmd8cC5jOiNmZmVlZWVlZSxzLnQ6NnxzLmU6Z3xwLmM6I2ZmYzljOWM5LHMudDo2fHMuZTpsLnQuZnxwLmM6I2ZmOWU5ZTll

Grey map without places

//mt{s}.googleapis.com/vt?pb=!1m5!1m4!1i{z}!2i{x}!3i{y}!4i256!2m3!1e0!2sm!3i336008092!3m14!2sen-US!3sUS!5e18!12m1!1e47!12m3!1e37!2m1!1ssmartmaps!12m4!1e26!2m2!1sstyles!2zcy5lOmd8cC5jOiNmZmY1ZjVmNSxzLmU6bHxwLnY6b2ZmLHMuZTpsLml8cC52Om9mZixzLmU6bC50LmZ8cC5jOiNmZjYxNjE2MSxzLmU6bC50LnN8cC5jOiNmZmY1ZjVmNSxzLnQ6MXxzLmU6Z3xwLnY6b2ZmLHMudDoyMXxzLmU6bC50LmZ8cC5jOiNmZmJkYmRiZCxzLnQ6MjB8cC52Om9mZixzLnQ6MnxwLnY6b2ZmLHMudDoyfHMuZTpnfHAuYzojZmZlZWVlZWUscy50OjJ8cy5lOmwudC5mfHAuYzojZmY3NTc1NzUscy50OjQwfHMuZTpnfHAuYzojZmZlNWU1ZTUscy50OjQwfHMuZTpsLnQuZnxwLmM6I2ZmOWU5ZTllLHMudDozfHAudjpvZmYscy50OjN8cy5lOmd8cC5jOiNmZmZmZmZmZixzLnQ6M3xzLmU6bC5pfHAudjpvZmYscy50OjUwfHMuZTpsLnQuZnxwLmM6I2ZmNzU3NTc1LHMudDo0OXxzLmU6Z3xwLmM6I2ZmZGFkYWRhLHMudDo0OXxzLmU6bC50LmZ8cC5jOiNmZjYxNjE2MSxzLnQ6NTF8cy5lOmwudC5mfHAuYzojZmY5ZTllOWUscy50OjR8cC52Om9mZixzLnQ6NjV8cy5lOmd8cC5jOiNmZmU1ZTVlNSxzLnQ6NjZ8cy5lOmd8cC5jOiNmZmVlZWVlZSxzLnQ6NnxzLmU6Z3xwLmM6I2ZmYzljOWM5LHMudDo2fHMuZTpsLnQuZnxwLmM6I2ZmOWU5ZTll

Digital Atlas of the Roman Empire

https://dh.gu.se/tiles/imperium/{z}/{x}/{y}.png

See this cool project: https://dh.gu.se/dare/

Mercator map from 1607

https://maps.georeferencer.com/georeferences/66a34667-1847-5ea6-b6a8-c81736a3425d/2018-08-26T20:22:32.883884Z/map/{z}/{x}/{y}.png?key=mpcE7jAf5llCJV0hoUfk

The Mercator map example refers to Georeferencer, a service for online maps where you can find many links to historical maps. Many institutions have their own Georeferencer account, such as the David Rumsey Collection, which offers a useful overview of its referenced maps (world map):

https://www.davidrumsey.com/view/georeferenced-maps

Create an account at Georeferencer to get the link to a map provided there (for example by the David Rumsey Collection). Log in at Georeferencer and choose a map here:

https://www.davidrumsey.com/view/georeferenced-maps

Then go to ‘This map’ and then to ‘Get Links’. Copy the link into the Map field of nodegoat.

As another example, the British Library also has a Georeferencer account. You can find their maps here, on the interactive map:

https://britishlibrary.georeferencer.com/api/v1/density
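All of these links are URL templates: the placeholders {z}, {x} and {y} stand for the zoom level and the tile coordinates, and {s} for a subdomain of the tile server; nodegoat fills them in for every map tile it requests. A tiny sketch of such an expansion (the numbers are arbitrary examples):

# Expand a tile URL template the way a map client does.
template = "https://dh.gu.se/tiles/imperium/{z}/{x}/{y}.png"
print(template.format(z=6, x=33, y=24))
# prints: https://dh.gu.se/tiles/imperium/6/33/24.png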

 

10. Import your Excel data (CSV data) with geo coordinates (longitude and latitude) into nodegoat and visualize the data on a map

In this tutorial I provide a set of test data with geo coordinates that you can easily import and visualize, like the map below. It shows positions of ships calculated from logbooks (18th – 19th century). The data is especially interesting for historical climate research.

Prerequisite: you already have a nodegoat account. If you don’t have an account yet, ask a friend where to get one, or your institution (university) if it provides nodegoat as a digital tool. If you are studying or working at the Faculty of Philosophy and History at the University of Bern, you can get a nodegoat account here: https://forms.gle/Gjm4682EJLsq5TCR7. Alternatively you can get a student account directly at nodegoat: https://nodegoat.net/

The following video (no sound, no commentary) is a step-by-step guide from scratch. The video starts with the login to your nodegoat account. The next steps are: create a project, create an Object Type, download the data sample from this website, import and visualize the data:

https://drive.google.com/file/d/1GjEaHBkn7VK37qeA34msFWRjEIRa5b5K/view?usp=sharing

If you prefer written instructions, you can continue here. These instructions are identical to the video, but contain some background information.

Log in to your nodegoat account. Import the CSV data into your already existing project or create a new one: ‘climate project’. We will import a data sample from a cool project about ships’ logbooks, which are important for weather observations. Here is the website of the project where the data is available:

Climatological Database for the World’s Oceans (CLIWOC): https://www.historicalclimatology.com/cliwoc.html

“The database consists of 287,114 logbooks written aboard Dutch, English, French, and Spanish sailing ships. The vast majority of these logbooks date from between 1750 and 1850, yet four ship logbooks were incorporated that predate 1750. These were centuries of European imperial expansion, and so the logbooks record the activities of sailors – both civilian and military – in oceans that span the entire globe.”

I have downloaded the following data: ‘Download as an Open Office Spreadsheet’

I opened the spreadsheet in Excel and first added a column on the far left to give the records (rows) a unique identifier, because they don’t have one. For this I entered a formula into field A2 in Excel, which contains the first record, that generates an identifier.

Then double-click on the fill handle at the bottom right of that cell and Excel fills the whole column with identifiers. The identifiers are very important: with them you can later update your data records in nodegoat (‘Update Existing Objects’). I always import the identifiers into nodegoat first and then update the records with additional information based on the identifiers. In the nodegoat Import web interface you can choose whether you want to create new records or update existing ones.
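If you prefer to script this step instead of using Excel, a minimal sketch with pandas could look like this (the file names and the ‘CLIWOC-’ prefix are invented examples, not the ones used in the tutorial):

import pandas as pd

# Add a unique identifier as the first column of the exported spreadsheet.
df = pd.read_csv("cliwoc_export.csv")  # hypothetical file name
df.insert(0, "Identifier", ["CLIWOC-%06d" % i for i in range(1, len(df) + 1)])
df.to_csv("cliwoc_with_ids.csv", index=False, encoding="utf-8")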

In nodegoat you can import up to 50k data records (rows) at once. So if you want to import all of the more than 200k data rows, you have to split them up. I have already done this and provide a test data sample with 20k records here that you can use for your import. I prepared this data and selected just a few columns to get started: identifier, ship name, year, longitude and latitude. You can download the test data sample here:

climate project test data (CSV)

Your Data Model in your project for this data sample should look like this:

Just add one Sub-Object with ‘year + coordinates’
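Spelled out as a rough outline (the Object Type name ‘Ship position’ is my own example label, not taken from the tutorial; only the fields matter):

# Sketch of the data model for the climate sample, not a nodegoat export format.
ship_position_model = {
    "object_type": "Ship position",
    "object_descriptions": ["Identifier", "Ship name"],
    "sub_object": {
        "date": "year (mapped to the Date Start field during import)",
        "location": "point built from longitude and latitude",
    },
}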

We will import the data into nodegoat via the web interface (you can also import data via the JSON interface; we will cover this in another tutorial).

Go to Model > Import > CSV Files and upload the downloaded CSV file (climate project test data) there.

Background: to import your data into nodegoat, the data must be available as a text file in UTF-8 format. In Excel, for example, you can save your data as CSV data: go to ‘Save as’ in your Excel sheet and choose CSV UTF-8 as the file format. CSV means comma-separated values. Open your CSV file with a text editor and you will see the many separators in the data.

Go to the Import Template. Map the fields of your CSV data to the fields in your data model. For the year (YR), select the Date Start field in your data model.

Now you can run your import template. You can first check a selection of the records to see whether you have mapped the fields correctly. Click on Next to import the 20k data records.

Have a coffee now, you have already achieved a lot today.

After the import, go to ‘Data’ and click on the Geographical Visualisation.

This is how your result should look. Zoom in and don’t forget to play with the time slider. You will also discover ships in the middle of the African desert; these are errors in the data where either the longitude or the latitude is missing. So the visualization also helps you to detect such errors.
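You can also catch such rows before the import with a quick check, for example (a sketch; the column names LON and LAT are placeholders for whatever your file actually uses):

import pandas as pd

# List the records whose longitude or latitude is missing,
# so they can be fixed or excluded before the import.
df = pd.read_csv("climate_project_test_data.csv")  # hypothetical file name
broken = df[df["LON"].isna() | df["LAT"].isna()]
print(len(broken), "records without complete coordinates")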

Reduce the points in the Visual Settings, if you like.

You can now update your data records based on the identifiers. Create a CSV file with the data you want to add. When importing, select ‘Update Existing Objects’ and choose your identifier to map the CSV data to the corresponding records in your nodegoat database.
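Such an update file only needs the identifier column plus whatever new columns you want to add, for example (a sketch with an invented extra column):

import pandas as pd

# Build an update file: keep the identifier and add one extra column,
# then import it with 'Update Existing Objects'.
df = pd.read_csv("cliwoc_with_ids.csv")  # hypothetical file name
update = df[["Identifier", "Ship type"]]  # 'Ship type' stands for any additional column
update.to_csv("cliwoc_update.csv", index=False, encoding="utf-8")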

 

11. Advanced Tutorial – Import of Open Data into nodegoat: Benjamin Franklin’s Post Office Records

Benjamin West (1738-1820): Benjamin Franklin Drawing Electricity from the Sky, Philadelphia Museum of Art. Image source: public domain, via Wikimedia Commons.

In 1743 the American Philosophical Society (APS) – the oldest learned society in the US – was founded in Philadelphia by Benjamin Franklin, John Bartram, Francis Hopkinson and others for the purpose of “promoting useful knowledge”, as you can read on the APS website. Today, the APS follows an open data strategy that encourages researchers to use and reuse the open datasets provided under a Creative Commons Attribution 4.0 License. In this tutorial we will work with such a dataset, as it is a very useful resource to show what you can do with it in nodegoat, focusing on importing and mapping the data. Citation of the dataset: Heider, Cynthia, Bayard Miller and Scott Ziegler. Post Office Book, 1748-1752. BF85f6-8. Distributed by Philadelphia: American Philosophical Society Library & Museum, 2017. https://diglib.amphilsoc.org/islandora/object/compound:11.

Franklin was not only the founder of the society, he also became Postmaster of Philadelphia in 1737, appointed by the British Crown Post. For a later period of his service as Postmaster, records of letters are still available; they are archived at the APS library and made available as Open Data, as we read on the APS website:

“Benjamin Franklin’s Post Office Records: Post Office Book, Philadelphia incoming and outgoing mail, 1748-1752. Created while Benjamin Franklin served as Postmaster of Philadelphia, these datasets reveal a wealth of previously untapped information about colonial correspondence.” (Source: APS website).

In the following tutorial we will import into nodegoat only the outgoing letters from the post office in Philadelphia. The tutorial consists of three videos which show from scratch how the data for the outgoing letters is imported and visualised in nodegoat. The videos have no sound and no commentary; just observe what to do. Requirements: you need a nodegoat account and Google Sheets (or Excel, but the step-by-step tutorial uses Google Sheets).

The challenge in Video 1 is to reformat the dates for import into nodegoat and to convert the American date format into the European one. An additional location (Philadelphia Post Office) is added for visualisation. The tutorial starts with the data download from the APS website:
https://diglib.amphilsoc.org/data

Video 1:

https://drive.google.com/file/d/19xyQkCuq33qI_aPk9hwkofA3N4zcGM1p/view?usp=sharing
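If you would rather do the date conversion outside of Google Sheets, a few lines of Python can handle it as well (a sketch, assuming the American dates come as MM/DD/YYYY strings; this is not the method shown in the video):

from datetime import datetime

# Convert an American-style date (MM/DD/YYYY) into the hyphenated
# day-month-year form used for the nodegoat import.
def american_to_nodegoat(value):
    d = datetime.strptime(value, "%m/%d/%Y")
    return "%d-%d-%d" % (d.day, d.month, d.year)

print(american_to_nodegoat("8/1/1750"))  # prints: 1-8-1750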

In video 2, a data model is first created in nodegoat, based on the column names of the data in Google Sheets. Then the data is downloaded as CSV from Google Sheets and uploaded to nodegoat, the data fields of the CSV file are mapped to the data fields in nodegoat, and the import process is started. During the import process nodegoat automatically proposes assignments for the locations in the CSV data, which can be accepted or rejected.

Video 2:

https://drive.google.com/file/d/16HTT5wVA-JVNa6W_WoZxcN71TtbdlvRy/view?usp=sharing

Video 3 shows how to work with the imported data. First a basic function is shown: if you click on just one object in nodegoat (a window with the object opens), only this object is visualised on the map (by clicking on the globe symbol). If you call up all objects, for example by selecting ‘all’ to the right of the filter symbol (screenshot below) or by scrolling through the pages (1, 2, 3, etc.), all objects are visualised:

If you do this with our data, as in the video, and click on the globe symbol, the locations that we assigned during the import process are displayed on the map. Not all of these locations are placed correctly, because during the import process we did not check whether each location was really the right one, but simply chose one in order to do the data cleansing afterwards in nodegoat. It would have been better to clean the data in Google Sheets or Excel BEFORE the import, which is highly recommended! But no worries: you can also clean up the data within nodegoat, as an example in the video shows. The video also shows how to add a database field ‘Comment’ to the data model and use it, for example to indicate that you are unsure about the location of a place. And it shows how to create a filter query to find locations that do not have geo coordinates.

Video 3:

https://drive.google.com/file/d/1DDCFXXYKpkhey_rGNmmOmnJk1qIjDGhy/view?usp=sharing