Update docs #294

Merged: 4 commits, Apr 23, 2022
Changes from all commits
7 changes: 5 additions & 2 deletions docs/make.jl
@@ -10,7 +10,7 @@ makedocs(
bib,
sitename = "GeometricFlux.jl",
format = Documenter.HTML(
- assets = ["assets/flux.css"],
+ assets = ["assets/flux.css", "assets/favicon.ico"],
canonical = "https://fluxml.ai/GeometricFlux.jl/stable/",
analytics = "G-M61P0B2Y8E",
),
@@ -23,6 +23,8 @@
"Graph Passing" => "basics/passgraph.md",
"Building Layers" => "basics/layers.md",
"Subgraph" => "basics/subgraph.md",
"Neighborhood graphs" => "basics/neighborhood_graph.md",
"Random graphs" => "basics/random_graph.md",
"Batch Learning" => "basics/batch.md",
],
"Cooperate with Flux Layers" => "cooperate.md",
@@ -43,7 +45,8 @@
"Pooling Layers" => "manual/pool.md",
"Embeddings" => "manual/embedding.md",
"Models" => "manual/models.md",
"Linear Algebra" => "manual/linalg.md"
"Linear Algebra" => "manual/linalg.md",
"Neighborhood graphs" => "manual/neighborhood_graph.md",
],
"References" => "references.md",
]
Binary file added docs/src/assets/favicon.ico
2 changes: 1 addition & 1 deletion docs/src/basics/batch.md
@@ -9,7 +9,7 @@ train_data = [(FeaturedGraph(g, nf=train_X), train_y) for _ in 1:N]
train_batch = Flux.batch(train_data)
```

- It batches up `FeaturedGraph` objects into specified mini-batch. A batch is passed to a GNN model and trained/inferred one by one. It is hard for `FeaturedGraph` objects to train or infer in real batch for GPU.
+ It batches up [`FeaturedGraph`](@ref) objects into a specified mini-batch. A batch is passed to a GNN model and trained/inferred on one at a time. It is difficult for [`FeaturedGraph`](@ref) objects to be trained or inferred in a true batch on GPU.

## Batch Learning for Static Graph Strategy

2 changes: 1 addition & 1 deletion docs/src/basics/conv.md
@@ -1,3 +1,3 @@
# Graph Convolutions

- Graph convolution can be classified into *spectral-based graph convolution* and *spatial-based graph convolution*. Spectral-based graph convolution, such as `GCNConv` and `ChebConv`, performs operation on features of *whole* graph at one time. Spatial-based graph convolution, such as `GraphConv` and `GATConv`, performs operation on features of *local* subgraph instead. Message-passing scheme is an abstraction for spatial-based graph convolutional layers. Any spatial-based graph convolutional layer can be implemented under the framework of message-passing scheme.
+ Graph convolution can be classified into *spectral-based graph convolution* and *spatial-based graph convolution*. Spectral-based graph convolution, such as [`GCNConv`](@ref) and [`ChebConv`](@ref), operates on the features of the *whole* graph at once. Spatial-based graph convolution, such as [`GraphConv`](@ref) and [`GATConv`](@ref), operates on the features of a *local* subgraph instead. The message-passing scheme is an abstraction for spatial-based graph convolutional layers; any spatial-based graph convolutional layer can be implemented within the message-passing framework.
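
Both families share the same calling convention. A minimal sketch, assuming a sample graph from Graphs.jl and hypothetical 10-to-5 feature dimensions (the constructors follow the signatures used elsewhere in these docs):

```julia
using Flux, Graphs, GeometricFlux

g = smallgraph(:karate)                # any Graphs.jl graph
fg = FeaturedGraph(g, nf=rand(Float32, 10, nv(g)))

spectral = GCNConv(10=>5, relu)        # spectral-based: whole-graph operation
spatial  = GraphConv(10=>5, relu)      # spatial-based: local message passing

fg_spec = spectral(fg)
fg_spat = spatial(fg)
```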
16 changes: 8 additions & 8 deletions docs/src/basics/layers.md
@@ -11,10 +11,10 @@ model = Chain(

In the example above, the feature dimension is mapped from `feat` to `h1` in the first layer, and then from `h1` to `h2` in the second layer. The default activation function is `identity` if it is not specified by the user.

- The initialization function `GCNConv(...)` constructs a `GCNConv` layer. For most of the layer types in GeometricFlux, a layer can be initialized in two ways:
+ The initialization function `GCNConv(...)` constructs a [`GCNConv`](@ref) layer. For most of the layer types in GeometricFlux, a layer can be initialized in two ways:

* GNN layer without graph: initializing *without* a predefined graph topology. This allows the layer to accept different graph topologies.
- * GNN layer with static graph: initializing *with* a predefined graph topology, e.g. graph wrapped in `FeaturedGraph`. This strategy is suitable for datasets where each input requires the same graph structure and it has better performance than variable graph strategy.
+ * GNN layer with static graph: initializing *with* a predefined graph topology, e.g. a graph wrapped in a [`FeaturedGraph`](@ref). This strategy is suitable for datasets where every input shares the same graph structure, and it performs better than the variable graph strategy.

The example above demonstrates the variable graph strategy. The equivalent GNN architecture with the static graph strategy is shown as follows:
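
The full example is collapsed in this diff view; a minimal sketch of the equivalent static-graph architecture, assuming a fixed graph `fg` and the same `feat`, `h1`, and `h2` dimensions as above, might look like:

```julia
model = Chain(
    WithGraph(fg, GCNConv(feat=>h1, relu)),
    WithGraph(fg, GCNConv(h1=>h2, relu)),
)
```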

@@ -34,13 +34,13 @@ GeometricFlux.WithGraph
When using GNN layers, the general guidelines are:

* With static graph strategy: you should pass in a ``d \times n \times batch`` array for node features, and the layer maps node features ``\mathbb{R}^d \rightarrow \mathbb{R}^k``, so the output will be an array with dimensions ``k \times n \times batch`` (see the sketch after this list). The same ostensibly goes for edge features, but as of now no layer type supports outputting new edge features.
- * With variable graph strategy: you should pass in a `FeaturedGraph`, the output will be also be a `FeaturedGraph` with modified node (and/or edge) features. Add `node_feature` as the following entry in the Flux chain (or simply call `node_feature()` on the output) if you wish to subsequently convert them to matrix form.
+ * With variable graph strategy: you should pass in a [`FeaturedGraph`](@ref); the output will also be a [`FeaturedGraph`](@ref) with modified node (and/or edge) features. Add [`node_feature`](@ref) as the next entry in the Flux chain (or simply call `node_feature()` on the output) if you wish to subsequently convert the features to matrix form.
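
A shape-flow sketch of both guidelines, assuming a sample graph and hypothetical dimensions (``d = 10``, ``k = 5``, batch size 8):

```julia
using Flux, Graphs, GeometricFlux

g = smallgraph(:karate)
fg = FeaturedGraph(g, nf=rand(Float32, 10, nv(g)))

# Static graph strategy: Array in, Array out.
layer = WithGraph(fg, GCNConv(10=>5))
X = rand(Float32, 10, nv(g), 8)   # d × n × batch
Y = layer(X)                      # size(Y) == (5, nv(g), 8)

# Variable graph strategy: FeaturedGraph in, FeaturedGraph out.
new_fg = GCNConv(10=>5)(fg)
H = node_feature(new_fg)          # back to matrix form
```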

## Define Your Own GNN Layer

Customizing your own GNN layer is the same as defining a layer in Flux. You may want to check the [Flux documentation](https://fluxml.ai/Flux.jl/stable/models/basics/#Building-Layers-1) first.

- To define a customized GNN layer, for example, we take a simple `GCNConv` layer as example here.
+ To define a customized GNN layer, we take a simple [`GCNConv`](@ref) layer as an example here.

```julia
struct GCNConv <: AbstractGraphLayer
@@ -52,13 +52,13 @@ end
@functor GCNConv
```

- We first should define a `GCNConv` type and let it be the subtype of `AbstractGraphLayer`. In this type, it holds parameters that a layer operate on. Don't forget to add `@functor` macro to `GCNConv` type.
+ We should first define a [`GCNConv`](@ref) type and make it a subtype of [`AbstractGraphLayer`](@ref). This type holds the parameters that the layer operates on. Don't forget to apply the `@functor` macro to the [`GCNConv`](@ref) type.

```julia
(l::GCNConv)(Ã::AbstractMatrix, x::AbstractMatrix) = l.σ.(l.weight * x * Ã .+ l.bias)
```

- Then, we can define the operation for `GCNConv` layer.
+ Then, we can define the operation for the [`GCNConv`](@ref) layer.

```julia
function (l::GCNConv)(fg::AbstractFeaturedGraph)
@@ -70,15 +70,15 @@ function (l::GCNConv)(fg::AbstractFeaturedGraph)
end
```

- Here comes to the GNN-specific behaviors. A GNN layer should accept object of subtype of `AbstractFeaturedGraph` to support variable graph strategy. A variable graph strategy should fetch node/edge/global features from `fg` and transform graph in `fg` into required form for layer operation, e.g. `GCNConv` layer needs a normalized adjacency matrix with self loop. Then, normalized adjacency matrix `Ã` and node features `nf` are pass through `GCNConv` layer `l(Ã, nf)` to give a new node feature. Finally, a `ConcreteFeaturedGraph` wrap graph in `fg` and new node features into a new object of subtype of `AbstractFeaturedGraph`.
+ Here come the GNN-specific behaviors. A GNN layer should accept an object of a subtype of `AbstractFeaturedGraph` to support the variable graph strategy. The layer should fetch node/edge/global features from `fg` and transform the graph in `fg` into the form required by the layer operation; e.g. a [`GCNConv`](@ref) layer needs a normalized adjacency matrix with self-loops. Then, the normalized adjacency matrix `Ã` and node features `nf` are passed through the [`GCNConv`](@ref) layer as `l(Ã, nf)` to give new node features. Finally, [`ConcreteFeaturedGraph`](@ref) wraps the graph in `fg` and the new node features into a new object of a subtype of `AbstractFeaturedGraph`.

```julia
layer = GCNConv(10=>5, relu)
new_fg = layer(fg)
gradient(() -> sum(node_feature(layer(fg))), Flux.params(layer))
```

- Now we complete a simple version of `GCNConv` layer. One can test the forward pass and gradient if they work properly.
+ Now we have completed a simple version of the [`GCNConv`](@ref) layer. One can test whether the forward pass and gradient work properly.

```@docs
GeometricFlux.AbstractGraphLayer
25 changes: 25 additions & 0 deletions docs/src/basics/neighborhood_graph.md
@@ -0,0 +1,25 @@
# Neighborhood Graphs

In machine learning, a neighborhood graph is often used to approximate a manifold in high-dimensional space. Constructing a neighborhood graph is an essential step for machine learning algorithms on graphs/manifolds, especially manifold learning.

The k-nearest neighbor (kNN) method is the most frequently used approach to construct a neighborhood graph. We provide [`kneighbors_graph`](@ref) to generate a kNN graph from a set of nodes/points.

We prepare 1,024 10-dimensional data points.

```julia
X = rand(Float32, 10, 1024)
```

Then, we can generate a kNN graph with `k=7`, which means each data point is linked to its 7 nearest neighbors.

```julia
fg = kneighbors_graph(X, 7)
```

The default distance metric is `Euclidean` from the Distances.jl package. If one wants to customize [`kneighbors_graph`](@ref) with a different distance metric, just take a distance object from the Distances.jl package directly and pass it to [`kneighbors_graph`](@ref).

```julia
using Distances

fg = kneighbors_graph(X, 7, Cityblock())
```
18 changes: 9 additions & 9 deletions docs/src/basics/passgraph.md
@@ -1,21 +1,21 @@
# Graph Passing Strategy

- Graph is an input data structure for graph neural network. Passing a graph into a GNN layer can have different behaviors. If the graph remains fixed across samples, that is, all samples utilize the same graph structure, a static graph is used. Or, graphs can be carried within `FeaturedGraph` to provide variable graphs to GNN layer. Users have the flexibility to pick an adequate approach for their own needs.
+ A graph is an input data structure for a graph neural network. Passing a graph into a GNN layer can have different behaviors. If the graph remains fixed across samples, that is, all samples utilize the same graph structure, a static graph is used. Alternatively, graphs can be carried within a [`FeaturedGraph`](@ref) to provide variable graphs to a GNN layer. Users have the flexibility to pick the approach adequate for their own needs.

## Variable Graph Strategy

- Variable graphs are supported through `FeaturedGraph`, which contains both the graph information and the features. Each `FeaturedGraph` can contain a distinct graph structure and its features. Data of `FeaturedGraph` are fed directly to graph convolutional layer or graph neural network to let each feature be learned on different graph structures. A adjacency matrix `adj_mat` is given to construct a `FeaturedGraph` as follows:
+ Variable graphs are supported through [`FeaturedGraph`](@ref), which contains both the graph information and the features. Each [`FeaturedGraph`](@ref) can contain a distinct graph structure and its features. A [`FeaturedGraph`](@ref) is fed directly to a graph convolutional layer or a graph neural network, so each feature is learned on its own graph structure. An adjacency matrix `adj_mat` is given to construct a [`FeaturedGraph`](@ref) as follows:

```julia
fg = FeaturedGraph(adj_mat, features)
layer = GCNConv(feat=>h1, relu)
```

- `Simple(Di)Graph`, `SimpleWeighted(Di)Graph` or `Meta(Di)Graph` provided by the packages Graphs, SimpleWeightedGraphs and MetaGraphs, respectively, are acceptable for constructing a `FeaturedGraph`. An adjacency list is also accepted, too.
+ `Simple(Di)Graph`, `SimpleWeighted(Di)Graph` or `Meta(Di)Graph`, provided by the packages Graphs, SimpleWeightedGraphs and MetaGraphs, respectively, are acceptable for constructing a [`FeaturedGraph`](@ref). An adjacency list is also accepted.
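
For instance, the same small graph can enter a [`FeaturedGraph`](@ref) either as a `SimpleGraph` or as an adjacency list; a minimal sketch with hypothetical 2-dimensional node features:

```julia
using Graphs, GeometricFlux

g = path_graph(4)                  # SimpleGraph from Graphs.jl
adjl = [[2], [1, 3], [2, 4], [3]]  # the same graph as an adjacency list
X = rand(Float32, 2, 4)

fg1 = FeaturedGraph(g, nf=X)
fg2 = FeaturedGraph(adjl, nf=X)
```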

- ### `FeaturedGraph` in, `FeaturedGraph` out
+ ### [`FeaturedGraph`](@ref) in, [`FeaturedGraph`](@ref) out

- Since a variable graph is provided from data, a `FeaturedGraph` object or a set of `FeaturedGraph` objects should be fed in a GNN model. The `FeaturedGraph` object should contain a graph and sufficient features that a GNN model needed. After operations, a `FeaturedGraph` object is given as output.
+ Since a variable graph is provided from data, a [`FeaturedGraph`](@ref) object or a set of [`FeaturedGraph`](@ref) objects should be fed to a GNN model. The [`FeaturedGraph`](@ref) object should contain a graph and the features that the GNN model needs. After the operations, a [`FeaturedGraph`](@ref) object is given as output.

```julia
fg = FeaturedGraph(g, nf=X)
@@ -36,11 +36,11 @@ layer = WithGraph(fg, GCNConv(feat=>h1, relu))

### Cached Graph in Layers

- While a variable graph is given by `FeaturedGraph`, a GNN layer doesn't need a static graph anymore. A cache mechanism is designed to cache static graph to reduce computation time. A cached graph is retrieved from `WithGraph` layer and operation is then performed. For each time, it will assign current computed graph back to layer.
+ While a variable graph is given by a [`FeaturedGraph`](@ref), a GNN layer doesn't need a static graph anymore. A cache mechanism is designed to cache the static graph and reduce computation time. The cached graph is retrieved from the [`WithGraph`](@ref) layer and the operation is then performed; each time, the currently computed graph is assigned back to the layer.

### Array in, Array out

- Since a static graph is provided from `WithGraph` layer, it doesn't accept a `FeaturedGraph` object anymore. Instead, it accepts a regular array as input, and outputs an array back.
+ Since a static graph is provided by the [`WithGraph`](@ref) layer, it doesn't accept a [`FeaturedGraph`](@ref) object anymore. Instead, it accepts a regular array as input and outputs an array back.

```julia
fg = FeaturedGraph(g)
Expand All @@ -50,11 +50,11 @@ H = layer(X)

## What you feed is what you get

- In GeometricFlux, there are are two APIs which allow different input/output types for GNN layers. For example, `GCNConv` layer provides the following two APIs:
+ In GeometricFlux, there are two APIs that allow different input/output types for GNN layers. For example, the [`GCNConv`](@ref) layer provides the following two APIs:

```julia
(g::WithGraph{<:GCNConv})(X::AbstractArray) -> AbstractArray
(g::GCNConv)(fg::FeaturedGraph) -> FeaturedGraph
```

- If your feed a `GCNConv` layer with a `Array`, it will return you a `Array`. If you feed a `GCNConv` layer with a `FeaturedGraph`, it will return you a `FeaturedGraph`. **These APIs ensure the consistency between input and output types.**
+ If you feed a [`GCNConv`](@ref) layer an `Array`, it returns an `Array`. If you feed a [`GCNConv`](@ref) layer a [`FeaturedGraph`](@ref), it returns a [`FeaturedGraph`](@ref). **These APIs ensure the consistency between input and output types.**
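
A minimal sketch of both call paths, assuming a graph `g` from Graphs.jl and hypothetical 10-to-5 dimensions:

```julia
fg = FeaturedGraph(g, nf=rand(Float32, 10, nv(g)))

layer = GCNConv(10=>5, relu)
new_fg = layer(fg)                          # FeaturedGraph in, FeaturedGraph out

static_layer = WithGraph(fg, GCNConv(10=>5, relu))
H = static_layer(rand(Float32, 10, nv(g)))  # Array in, Array out
```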
48 changes: 48 additions & 0 deletions docs/src/basics/random_graph.md
@@ -0,0 +1,48 @@
# Random Graphs

## Random Graph Generation

A graph is needed as input for a GNN model. Random graphs can be generated with the Graphs.jl package; for example, a random graph can be generated from the `erdos_renyi` model.

```julia
julia> using Graphs

julia> g = erdos_renyi(10, 30)
{10, 30} undirected simple Int64 graph
```

To construct a `FeaturedGraph` object, just put the graph object and its corresponding features into it.

```julia
julia> X = rand(Float32, 5, 10);

julia> fg = FeaturedGraph(g, nf=X)
FeaturedGraph:
Undirected graph with (#V=10, #E=30) in adjacency matrix
Node feature: ℝ^5 <Matrix{Float32}>
```

Various random graphs from different generative models can be used here.

```julia
julia> barabasi_albert(10, 3)
{10, 21} undirected simple Int64 graph

julia> watts_strogatz(10, 4, 0.3)
{10, 20} undirected simple Int64 graph
```

`barabasi_albert` generates graphs from the scale-free network model, while `watts_strogatz` generates graphs from the small-world model.


## Common Graphs

Some commonly used graphs are listed here.

```julia
clique_graph(k, n)
complete_graph(n)
grid(dims; periodic=false)
path_digraph(n)
path_graph(n)
```
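
Any of these generators can feed a `FeaturedGraph` in the same way; a minimal sketch with hypothetical 2-dimensional node features:

```julia
using Graphs, GeometricFlux

g = complete_graph(10)
fg = FeaturedGraph(g, nf=rand(Float32, 2, 10))
```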
6 changes: 3 additions & 3 deletions docs/src/basics/subgraph.md
@@ -1,13 +1,13 @@
# Subgraph

- ## Subgraph of `FeaturedGraph`
+ ## Subgraph of [`FeaturedGraph`](@ref)

- A `FeaturedGraph` object can derive a subgraph from a selected subset of the vertices of the graph.
+ A [`FeaturedGraph`](@ref) object can derive a subgraph from a selected subset of the vertices of the graph.

```julia
train_idx = train_indices(Planetoid(), :cora)
fg = FeaturedGraph(g)
fsg = subgraph(fg, train_idx)
```

- A `FeaturedSubgraph` object is returned from `subgraph` by selected vertices `train_idx`.
+ A `FeaturedSubgraph` object is returned from [`subgraph`](@ref) for the selected vertices `train_idx`.
10 changes: 5 additions & 5 deletions docs/src/cooperate.md
@@ -1,10 +1,10 @@
# Cooperate with Flux Layers

- GeometricFlux is designed to be compatible with Flux layers. Flux layers usually have array input and array output. Since the mechanism of "what you feed is what you get", the API for array type is compatible directly with other Flux layers. However, the API for `FeaturedGraph` is not compatible directly.
+ GeometricFlux is designed to be compatible with Flux layers. Flux layers usually have array input and array output. Due to the "what you feed is what you get" mechanism, the array-type API is directly compatible with other Flux layers. However, the API for [`FeaturedGraph`](@ref) is not directly compatible.

- ## Fetching Features from `FeaturedGraph` and Output Compatible Result with Flux Layers
+ ## Fetching Features from [`FeaturedGraph`](@ref) and Outputting Results Compatible with Flux Layers

- With a layer outputs a `FeaturedGraph`, it is not compatible with Flux layers. Since Flux layers need single feature in array form as input, node features, edge features and global features can be selected by using `FeaturedGraph` APIs: `node_feature`, `edge_feature` or `global_feature`, respectively.
+ When a layer outputs a [`FeaturedGraph`](@ref), it is not compatible with Flux layers. Since Flux layers need a single feature in array form as input, node features, edge features and global features can be selected using the [`FeaturedGraph`](@ref) APIs: [`node_feature`](@ref), [`edge_feature`](@ref) or [`global_feature`](@ref), respectively.

```julia
model = Chain(
@@ -26,7 +26,7 @@ model = Chain(

## Branching Different Features Through Different Layers

- A `GraphParallel` construct is designed for passing each feature through different layers from a `FeaturedGraph`. An example is given as follow:
+ A [`GraphParallel`](@ref) construct is designed for passing each feature of a [`FeaturedGraph`](@ref) through different layers. An example is given as follows:

```julia
Flux.Chain(
@@ -40,7 +40,7 @@ Flux.Chain(
)
```

- `GraphParallel` will pass node feature to a `Dropout` layer and edge feature to a `Dense` layer. Meanwhile, a `FeaturedGraph` is decomposed and keep the graph in `FeaturedGraph` to the downstream layers. A new `FeaturedGraph` is constructed with processed node feature, edge feature and global feature. `GraphParallel` acts as a layer which accepts a `FeaturedGraph` and output a `FeaturedGraph`. Thus, it by pass the graph in a `FeaturedGraph` but pass different features to different layers.
+ [`GraphParallel`](@ref) passes node features to a `Dropout` layer and edge features to a `Dense` layer. Meanwhile, the [`FeaturedGraph`](@ref) is decomposed and its graph is kept for the downstream layers. A new [`FeaturedGraph`](@ref) is constructed with the processed node, edge and global features. [`GraphParallel`](@ref) acts as a layer which accepts a [`FeaturedGraph`](@ref) and outputs a [`FeaturedGraph`](@ref). Thus, it passes the graph in a [`FeaturedGraph`](@ref) through unchanged while passing different features to different layers.
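
Since the example above is truncated by the collapsed hunk, a self-contained sketch might look like the following; the layer dimensions are hypothetical, and the `node_layer`/`edge_layer` keyword names are assumed from the `GraphParallel` constructor:

```julia
using Flux, GeometricFlux

model = Flux.Chain(
    GraphParallel(
        node_layer=Dropout(0.5),        # applied to node features only
        edge_layer=Dense(10, 5, relu),  # applied to edge features only
    ),
)

new_fg = model(fg)  # FeaturedGraph in, FeaturedGraph out; the graph passes through unchanged
```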

```@docs
GeometricFlux.GraphParallel