Introduction to hierarchical clustering (Part 3 — Spatial clustering)

By Medium - 2021-03-12

Description

In our attempt to cluster crimes in London in the previous article, we ignored the spatial dimension of the data in performing the clustering. Thus, this article seeks to remedy this by explicitly…

Summary

  • Introducing a spatial dimension into hierarchical clustering: in our attempt to cluster crimes in London in the previous article, we ignored the spatial dimension of the data when performing the clustering.
  • Thus, this article seeks to remedy this by explicitly accounting for it.
  • What is good about agglomerative clustering is that we can add a connectivity constraint in the algorithm so that only adjacent clusters can be merged together.
  • It is important to note that how this algorithm performs when using the connectivity constraint depends on the linkage parameter chosen. As noted in the sklearn documentation, Ward’s linkage avoids the ‘rich getting richer’ problem associated with other linkage options [1], so we hope to see relatively even clusters.
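The connectivity-constrained clustering described above can be sketched with scikit-learn. This is a minimal illustration, not the article's actual code: the coordinates are synthetic stand-ins for crime locations, and the neighbour count and cluster count are arbitrary choices for the example.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
# Synthetic stand-in for crime coordinates: two spatially separated blobs
coords = np.vstack([
    rng.normal(loc=(0.0, 0.0), scale=0.1, size=(50, 2)),
    rng.normal(loc=(1.0, 1.0), scale=0.1, size=(50, 2)),
])

# Connectivity constraint: each point may only merge with clusters that
# contain one of its 5 nearest spatial neighbours, keeping clusters
# spatially contiguous
connectivity = kneighbors_graph(coords, n_neighbors=5, include_self=False)

model = AgglomerativeClustering(
    n_clusters=2,
    linkage="ward",  # Ward linkage avoids the 'rich getting richer' issue
    connectivity=connectivity,
)
labels = model.fit_predict(coords)
print(labels.shape)
```

Without the `connectivity` argument, merges are driven purely by feature-space distance; with it, the same algorithm is forced to respect the spatial adjacency graph.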


Topics

  1. Stock (0.12)
  2. NLP (0.1)
  3. Machine_Learning (0.06)

Similar Articles

HDBSCAN Clustering with Neo4j

By Medium - 2021-01-15

I recently came across the article “How HDBSCAN works” by Leland McInnes, and I was struck by the informative, accessible way he explained…

Self-Organising Textures

By Distill - 2021-02-27

Neural Cellular Automata learn to generate textures, exhibiting surprising properties.