
Digital Humanities

Using digital tools to research humanities questions

Digital Humanities Resources, Methods and Tools

Here are some general-purpose resources for developing digital humanities projects.

For resources and tools related to specific DH methodologies, please explore the other tabs in this box!

Best Practices for Digital Humanities Projects

  • A brief guide from the Center for Digital Research in the Humanities at the University of Nebraska - Lincoln

Development for the Digital Humanities

  • A resource for developing and managing digital humanities projects. Includes sections on formulating research questions, defining projects, making a plan for data, and many other crucial features of a DH project.

Digital History

  • This book provides a plainspoken and thorough introduction to the web for historians who wish to produce online historical work, or to build upon and improve the projects they have already started in this important new medium.

Digital Research Tools (DiRT)

  • Archived version (2019) of the DiRT Directory, a registry of digital research tools for scholarly use.

Introduction to Digital Humanities Course Book

  • Adapted from UCLA's DH 101 course

Programming Historian

  • Peer-reviewed, hands-on workshops and tutorials

Visualizing Objects, Places, and Spaces: A Digital Project Handbook

  • A guide to the essential steps needed to plan a digital project. You can learn more about various project types, or look at case studies and assignments to see what others have done. You can also submit your own work for inclusion.

 

Text encoding and analysis involves using computational tools to analyze large amounts of text, such as books, articles, and manuscripts, to uncover patterns and connections that would be difficult to find by reading them manually.

Tools and resources for text encoding and analysis:

  • Text Encoding Initiative (TEI) Consortium     
    • Provides guidelines for structuring texts for digital analysis, along with project examples
  • Natural Language Processing with Python     
    • This freely available book provides a practical introduction to natural language processing (NLP) using the Python programming language, and it includes many examples of how NLP can be applied to textual data in the humanities 
  • Voyant Tools     
    • A web-based text-analysis tool  
  • WordSmith     
    • A tool that allows users to develop concordances, find keywords, and develop word lists from plain text files  
  • Constellate     
    • Currently (2022), UCLA researchers have access to the free platform. Constellate allows you to build collections of content from multiple platforms (JSTOR, Portico, Chronicling America) as well as learn, teach, and perform text analysis (Constellate tutorial list)  
  • OpenRefine     
    • A powerful tool for working with messy data: cleaning it; transforming it from one format into another; and extending it with web services and external data.
  • MALLET     
    • Open source machine-learning toolkit. Topic models are useful for analyzing large collections of unlabeled text. The MALLET topic modeling toolkit contains efficient, sampling-based implementations of Latent Dirichlet Allocation, Pachinko Allocation, and Hierarchical LDA.
  • HathiTrust Research Center (HTRC) Analytics     
    • Supports large-scale computational analysis of the works in the HathiTrust Digital Library to facilitate non-profit and educational research. Related: Programming Historian Python text mining tutorial for HathiTrust Research Center’s Extracted Features dataset
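To give a concrete sense of what these tools automate, here is a minimal Python sketch (standard library only) of two basic text-analysis operations: word-frequency counts, as in Voyant Tools, and a simple keyword-in-context concordance, as in WordSmith. The sample text and the tokenizer are illustrative simplifications, not a substitute for the NLP techniques covered in the resources above.

```python
import re
from collections import Counter

def tokenize(text):
    """Lowercase a text and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def word_frequencies(text, n=5):
    """Return the n most common tokens with their counts."""
    return Counter(tokenize(text)).most_common(n)

def concordance(text, keyword, width=3):
    """Simple keyword-in-context (KWIC) view: each occurrence of the
    keyword with `width` tokens of context on either side."""
    tokens = tokenize(text)
    lines = []
    for i, tok in enumerate(tokens):
        if tok == keyword:
            left = " ".join(tokens[max(0, i - width):i])
            right = " ".join(tokens[i + 1:i + 1 + width])
            lines.append(f"{left} [{tok}] {right}")
    return lines

sample = ("It was the best of times, it was the worst of times, "
          "it was the age of wisdom, it was the age of foolishness.")
print(word_frequencies(sample))
print(concordance(sample, "age"))
```

Real projects would add stemming, stop-word removal, or the statistical models described in Natural Language Processing with Python, but the underlying count-and-index logic is the same.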

   

Digital mapping and spatial analysis involves creating digital maps and spatial analyses to study the relationships between people, places, and events in the past and present.

Tools and resources for digital mapping and spatial analysis:

  • ArcGIS     
    • Geographic Information Systems (GIS) mapping software.
  • OpenStreetMap     
    • Created by a global community of contributors, OpenStreetMap is a free, editable map of the world with an emphasis on local knowledge, existing as open data that can be used for research projects (or any other purpose) with proper credit
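As a small illustration of the computation underlying spatial analysis, the sketch below uses the haversine formula to estimate the great-circle distance between two points given in decimal degrees. The coordinates are approximate and chosen only as an example; GIS platforms like ArcGIS perform this kind of proximity calculation (and far more) behind the scenes.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two points
    given in decimal degrees, via the haversine formula."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # Earth's mean radius is about 6371 km

# Example: UCLA to the British Library (approximate coordinates)
print(round(haversine_km(34.0716, -118.4422, 51.5299, -0.1276)))
```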

Network analysis and visualization involves using computational tools to analyze and visualize relationships between people, ideas, and events, in order to understand how they are connected and how they have changed over time.

Tools and resources for network analysis

  • Gephi     
    • Free open source software for network analysis and visualization    
  • Cytoscape     
    • An open source software platform for visualizing complex networks
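The sketch below illustrates the edge-list data model that tools like Gephi and Cytoscape import, using a hypothetical correspondence network (who wrote letters to whom) and computing node degree, one of the simplest network-analysis measures, with only the Python standard library.

```python
from collections import defaultdict

# Hypothetical correspondence network: each pair is one letter exchange.
edges = [
    ("Woolf", "Strachey"),
    ("Woolf", "Fry"),
    ("Strachey", "Keynes"),
    ("Fry", "Keynes"),
    ("Woolf", "Keynes"),
]

# Build an undirected adjacency list from the edge list.
adjacency = defaultdict(set)
for a, b in edges:
    adjacency[a].add(b)
    adjacency[b].add(a)

# Degree = number of distinct correspondents per person.
degree = {node: len(neighbors) for node, neighbors in adjacency.items()}
print(sorted(degree.items(), key=lambda kv: -kv[1]))
```

Exporting such an edge list as CSV is typically all that is needed to load a network into Gephi or Cytoscape for visualization and richer metrics.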

Digital preservation and archiving involves creating digital copies of historical and cultural artifacts and making them available online, with the goal of preserving them for future generations.

Tools and resources for digital preservation:

  • Omeka     
    • Omeka is a free, flexible, and open source web-publishing platform for the display of library, museum, archives, and scholarly collections and exhibitions. Its “five-minute setup” makes launching an online exhibition as easy as launching a blog. To create maps and timelines, see Neatline, a suite of add-on tools.  
  • CollectionBuilder     
    • CollectionBuilder is an open source tool for creating digital collection and exhibit websites that are driven by metadata and powered by modern static web technology
  • Drupal     
    • Drupal is an open source content management system for supporting resources like blogs and web sites
  • oXygen     
    • A cross-platform XML editor that may be used to create and validate XML documents and associated schema
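As a minimal illustration of the XML checking that editors like oXygen automate, the Python sketch below parses a tiny TEI-style fragment (illustrative only, not a complete or schema-valid TEI document) and confirms it is well-formed before extracting its paragraphs.

```python
import xml.etree.ElementTree as ET

# A tiny TEI-style fragment; real TEI documents require a full header
# and must validate against the TEI schema.
doc = """<text>
  <body>
    <p>It is a truth universally acknowledged that well-formed markup parses.</p>
    <p>Malformed markup raises a ParseError instead.</p>
  </body>
</text>"""

# fromstring() raises xml.etree.ElementTree.ParseError on malformed XML,
# which is the first of the checks an XML editor performs.
root = ET.fromstring(doc)
paragraphs = [p.text for p in root.iter("p")]
print(len(paragraphs))
```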

Data visualization is the process of using graphical representations to show the results of data analysis, such as graphs, charts, and maps, which can help to identify patterns and trends. See the UCLA Data Visualization Research Guide for more information.

Collaborative research and annotation is the practice of multiple researchers working together using digital tools and platforms to annotate, analyze, and interpret data.

Tools and resources for collaborative research and annotation:

  • Recogito     
    • Supports semantic markup/annotation, named-entity recognition, and geotagging with GeoNames   
  • Scalar     
    • A semantic web authoring and publishing platform that supports various media annotations and multiple authors   
  • Hypothesis     
    • Annotate the web with anyone, anywhere

Virtual reality (VR) and 3D modeling involves using virtual reality and 3D modeling techniques to create immersive simulations of historical and cultural sites, which can be used for research and education.

Tools and resources for VR and 3D modeling:

  • Blender     
    • Free open source 3D creation software that provides tools for modeling, animation, and simulation
  • SketchUp     
    • 3D modeling software with a simple, user-friendly interface for creating 3D models and scenes 
  • Maya     
    • A professional 3D animation software developed by Autodesk for more technically advanced users which provides a comprehensive set of tools for creating complex 3D models, animations, and simulations

General Purpose Data Analysis Tools and Resources

Here are some commonly used tools for data cleaning, statistical analysis and visualization.

UCLA offers various free and discounted licenses for some software products, so make sure to check the list before paying for a program.

Python is a general-purpose programming language widely used for data analysis and text processing.

R is a programming language designed for statistical computing, data analysis, and graphics.

MATLAB is a proprietary programming language and numeric computing environment.

OpenRefine is a powerful tool for working with messy data: cleaning it; transforming it from one format into another; and extending it with web services and external data.
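The sketch below imitates two common OpenRefine cleaning operations in plain Python: trimming and collapsing whitespace, and case-folding to merge near-duplicate entries. The sample values are hypothetical, and OpenRefine's actual clustering algorithms are considerably more sophisticated.

```python
# Messy place names as they might appear in a hand-entered spreadsheet.
raw_names = [
    "  Los Angeles",
    "los angeles ",
    "LOS  ANGELES",
    "San Francisco",
    "san francisco",
]

def normalize(value):
    """Collapse internal whitespace, strip the ends, and title-case,
    so trivially different spellings merge into one value."""
    return " ".join(value.split()).title()

cleaned = sorted({normalize(v) for v in raw_names})
print(cleaned)
```

Five messy strings collapse to two clean values, which is the essence of OpenRefine's trim, case-transform, and cluster-and-merge workflow.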

Tableau Software helps people see and understand data. Tableau allows anyone to perform sophisticated analytics and share their findings with online dashboards.

Stata is a proprietary, general-purpose statistical software package for data manipulation, visualization, statistics, and automated reporting. It is used by researchers in many fields, including biomedicine, economics, epidemiology and sociology.

SPSS (Statistical Package for the Social Sciences) is a software package used for the analysis of statistical data. Although the name of SPSS reflects its original use in the field of social sciences, its use has since expanded into other data markets.

ArcGIS is geospatial software used to view, edit, manage, and analyze geographic data. It enables users to visualize spatial data and create maps.

Stack Overflow

  • A community where people can ask, answer, and search for questions related to programming

Software Carpentry

  • Lessons to build software skills, part of the larger community The Carpentries, which aims to teach foundational computational and data science skills to researchers

GitHub

  • Cloud-based service built on the Git version-control software that allows developers to store and manage their code, especially helpful for version control during collaboration. Software Carpentry has a lesson on Git and GitHub where you can learn more

Open Data Tools

  • List of tools and resources to explore, publish, and share public datasets with sections specifically for visualization, data, source code, and information.

Data Science Notebooks

  • List of interactive computing platforms for data science, includes comparison table at the bottom of the page

Gephi

  • Open graph visualization platform, well-known as a tool for network visualization

Digital Humanities in Practice

DH Projects at UCLA:


Other DH Projects:

Electronic Journals:

Subject Specific:

Non-Peer Reviewed: