TY - JOUR
AU -
AB - Jupyter notebook allows data scientists to write machine learning code together with its documentation in cells. In this paper, we propose a new task of code documentation generation (CDG) for computational notebooks. In contrast to the previous CDG tasks which focus on generating documentation for single code snippets, in a computational notebook, one documentation in a markdown cell often corresponds to multiple code cells, and these code cells have an inherent structure. We proposed a new model (HAConvGNN) that uses a hierarchical attention mechanism to consider the relevant code cells and the relevant code tokens information when generating the documentation. Tested on a new corpus constructed from well-documented Kaggle notebooks, we show that our model outperforms other baseline models.
TI - HAConvGNN: Hierarchical Attention Based Convolutional Graph Neural Network for Code Documentation Generation in Jupyter Notebooks
JF - Findings of the Association for Computational Linguistics: EMNLP 2021
DO - 10.18653/v1/2021.findings-emnlp.381
DA - 2021-01-01
UR - https://www.deepdyve.com/lp/unpaywall/haconvgnn-hierarchical-attention-based-convolutional-graph-neural-KzCGuvcvrV
DP - DeepDyve
ER -