Graph Neural Networks Variations And Applications.

Hello and a warm welcome to yet another crunchy blog. Today we will be talking about a more specific topic rather than exploring an entire field, and that topic is Graph Neural Networks. Remember, neural networks themselves are an ocean, and we are not diving into all of it directly. So, fasten your seatbelt as we explore the depths of Graph Neural Networks.

What is a Graph Neural Network?

So, here I will talk a little bit about Graph Neural Networks: an overview of what's been going on, a very gentle introduction to the idea behind them, the history, and a couple of applications. So, what is the problem that we're interested in? Suppose you have a dataset that contains many graphs, and each of these graphs has some sort of label, maybe a real value, a classification, or something more sophisticated. You want to design some sort of machine learning function that can take a graph as input and return the label as output.

So, to give you a concrete example, maybe you have some molecules, which are naturally graphs, and these molecules have some properties. An interesting property would be: how likely is that molecule to be a drug? And you want to build a function that is able to take new molecules and tell you whether they are likely to be a drug. How are we going to approach this? The starting point is going to be recurrent neural networks.

Recurrent neural networks

Recurrent neural networks are a good place to start, and they are very successful. They basically operate on a very special type of graph: the chain graph. If you have some text, then it's a sequence of tokens, all connected to each other by links. We represent chain graphs using recurrent units: we replace each node with a recurrent unit, which is drawn as a triangle, and we link them together with arrows. And then we proceed by embedding each token in the sequence.

So, there are some words stored in these nodes, and we embed them; we're going to draw each embedding as an envelope, to invoke the idea of a message. Each node gets its envelope, and then we do our usual thing of running the recurrent neural network forwards, using some recurrence relation which, given the current state and a new message, gives us a new state. Hopefully this is all very familiar; really, the purpose here is to introduce the graphical notation. Because what we're going to do now is move to graph-structured data: once we start adding edges to the chain, we get a general graph. And the important thing about graphs is that you can represent them as a nice adjacency matrix.
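To make that recurrence concrete, here is a minimal NumPy sketch of an RNN consuming a chain of token embeddings. The tanh update and the weight names (W_state, W_msg) are illustrative assumptions, not the only possible choice.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # size of embeddings and hidden state (illustrative choice)

# Toy chain graph: 5 tokens, each embedded as a d-dimensional "envelope".
messages = rng.normal(size=(5, d))

# Parameters of the recurrence (randomly initialised for the sketch).
W_state = rng.normal(size=(d, d)) * 0.1
W_msg = rng.normal(size=(d, d)) * 0.1

state = np.zeros(d)  # initial hidden state
for m in messages:   # walk along the chain, one token at a time
    # new_state = f(current_state, new_message)
    state = np.tanh(W_state @ state + W_msg @ m)

print(state.shape)   # (8,) -- the final state summarises the chain
```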

But there's a fundamental symmetry that graphs have: you can draw them in many different ways, and they all represent the same thing. You can permute the order of the vertices and you get a different adjacency matrix, but fundamentally all of these representations describe exactly the same graph. And so, any model that we construct for analyzing graphs has to be invariant under these symmetries. So, how are we going to adapt the RNN framework to graphs?
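Before we answer that, a quick sketch of the symmetry just described: relabelling the vertices with a permutation matrix P turns the adjacency matrix A into P A Pᵀ, a different matrix that encodes the same graph. The small example below is an illustration, not any particular molecule.

```python
import numpy as np

# Adjacency matrix of a 4-node graph (undirected, so symmetric).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 0],
              [1, 0, 0, 0]])

perm = [2, 0, 3, 1]                   # one possible relabelling of the vertices
P = np.eye(4, dtype=int)[perm]        # permutation matrix

A_perm = P @ A @ P.T                  # adjacency matrix after relabelling
print(not np.array_equal(A, A_perm))  # True: a different matrix...
# ...yet it encodes exactly the same graph, which is why a graph model
# must be invariant under vertex permutations.
```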

Adapting RNNs to graphs

So, here's a graph that is provocatively shaped like a molecule, but it could be any graph. And we're going to start just as we did before, by taking some features of the nodes. Maybe this node is a carbon atom, and its feature vector is just a one-hot encoding: is the atom hydrogen, is the atom carbon, is the atom fluorine, with a one in the matching slot and zeros everywhere else. So, we can put some features onto the nodes, and we store them in the state of the nodes; that's the gray envelope. Then we do that for all the nodes, so they all get their features, and then we associate a neural network with every edge of a specific type. Maybe we have single bonds represented by green edges and double bonds represented by yellow edges, but in general you might have many different edge types like that.
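Here's a hedged sketch of that setup; the node names (C1, F1, ...), the tiny atom vocabulary, and the use of plain linear maps as the per-edge-type "networks" are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
atom_types = ["H", "C", "F"]  # illustrative vocabulary of node types

def one_hot(atom):
    """One in the slot matching the atom's type, zeros everywhere else."""
    v = np.zeros(len(atom_types))
    v[atom_types.index(atom)] = 1.0
    return v

# Features for the nodes of a toy molecule (hypothetical node names).
node_features = {
    "C1": one_hot("C"),
    "C2": one_hot("C"),
    "F1": one_hot("F"),
    "H1": one_hot("H"),
}

d = len(atom_types)
# One small network (here just a linear map) per *edge type*.
edge_nets = {
    "single": rng.normal(size=(d, d)) * 0.1,  # e.g. the green edges
    "double": rng.normal(size=(d, d)) * 0.1,  # e.g. the yellow edges
}
```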

A knowledge base would have different types of relationships, and so on. So, that's the basic idea, and then we're going to replace all of the nodes with recurrent units, the triangles. The message passing is going to proceed as follows. Imagine you zoom in on a particular node: that node will pull all of the messages from its neighbors, and as the messages are pulled, they go through the neural networks on the edges they have to pass over. And all nodes at time T pull the messages from time T-1 from their neighbors, so all nodes are pulling simultaneously from the previous timestep. Once we have collected all the messages, we perform a sum, and this sum is invariant to the order of the neighboring messages: you can permute the envelopes inside that sum and the sum hasn't changed.
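A minimal sketch of one such synchronous step, assuming states, edges, and per-type edge networks shaped like the ones above; a real implementation would batch this with matrix operations rather than Python loops.

```python
import numpy as np

def message_passing_step(states, edges, edge_nets):
    """One synchronous step: every node pulls messages from time t-1.

    states:    dict node -> state vector at time t-1
    edges:     list of (src, dst, edge_type) directed edges
    edge_nets: dict edge_type -> weight matrix on that edge type
    """
    new_states = {}
    for node, state in states.items():
        incoming = np.zeros_like(state)
        for src, dst, etype in edges:
            if dst == node:
                # The message passes through the network on its edge...
                incoming += edge_nets[etype] @ states[src]
        # ...and the sum over neighbours is order-invariant, so any
        # permutation of the incoming envelopes gives the same result.
        new_states[node] = np.tanh(state + incoming)
    return new_states
```

Calling this function repeatedly, feeding each step's output back in as the next step's input, implements the round-and-round propagation described below.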

Are the Neural Networks Bidirectional?

Usually, neural networks map from inputs to outputs. Here, each of these neural networks simply maps a message vector to a vector of the same size. But what happens if you send a message the other way along an edge? Is there a separate neural network?

So, the edges are all directed, and if you want an undirected edge, you can add the same edge backward and then share the network across both directions. Okay, so that was a single timestep: we just pulled messages from our first-order neighbors. You can think of it this way: after this single timestep, each node basically knows about its own information and the information from nodes a distance of one away. And then we just repeat this over and over again.
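A tiny illustration of that trick, with hypothetical node names: each undirected bond is stored as two directed edges carrying the same edge-type key, so both directions share one network.

```python
# An undirected bond becomes two directed edges sharing one network:
directed = [("C1", "C2", "single"), ("C2", "F1", "single")]
undirected = directed + [(dst, src, etype) for src, dst, etype in directed]
# Both directions look up the same entry in edge_nets, so the message
# network (and its gradients) is shared across the two arrows.
```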

So, after the second timestep, each node knows about its first-order and second-order neighbors, and we can just keep going. We stop after a fixed number of timesteps, and that fixed number, in our particular variety of graph neural network, is a hyperparameter. So, you decide how many steps you're going to propagate, sort of the radius over which you will be smooshing the information around before you stop. And once you've finished going round, and round, and round, you have representations stored on all the nodes that have collected information from the local environment of the graph. Then you just collect them all up and perform a sum, and again this sum is permutation invariant, and then you have a representation of your graph, which you feed to whatever higher layers you want to perform your task.
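Putting the pieces together, a minimal sketch of the full forward pass might look like the following, where T is the propagation hyperparameter from the text; the function and argument names are assumptions for illustration.

```python
import numpy as np

def gnn_forward(node_features, edges, edge_nets, T=3):
    """Run T message-passing steps, then sum-pool into one graph vector.

    T is the hyperparameter from the text: how many steps to propagate,
    i.e. the radius over which information gets smooshed around.
    """
    states = {n: f.copy() for n, f in node_features.items()}
    for _ in range(T):
        new_states = {}
        for node, state in states.items():
            incoming = np.zeros_like(state)
            for src, dst, etype in edges:
                if dst == node:
                    incoming += edge_nets[etype] @ states[src]
            new_states[node] = np.tanh(state + incoming)
        states = new_states
    # Permutation-invariant readout: summing the node states in any
    # order yields the same graph-level representation.
    return sum(states.values())
```

The returned vector is the graph representation that gets fed to whatever higher layers produce the final label.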

Conclusion

So, this was all about Graph Neural Networks in the simplest way possible. I hope you enjoyed reading it and that it boosted your knowledge. I tried to keep it less technical and more informative, with only small sketches rather than full implementations. On this note, I would like to wrap up this article; stay updated as I bring you the latest tech knowledge in the simplest way possible.



Please let me know if you liked the post, and do share it with your friends.
