
Context based Approach for Second Language Acquisition

This project is the implementation of the system submitted to the SLAM 2018 (Second Language Acquisition Modeling) shared task.

This page gives instructions for replicating the results of our system.

Table of Contents

- Installation
- Downloading Data
- Parameters for the Experiment
- Prepare Data
- Train your model
- Test your predictions
- Citation

Installation

Our project is written in Python and is compatible with both Python 2 and 3. In this section, we describe the installation procedure for Ubuntu 16.04.

```shell
git clone https://github.com/iampuntre/slam18.git
cd slam18
virtualenv env
source env/bin/activate
pip install -r requirements.txt
mkdir data
```

Note: Follow the equivalent instructions for your operating system.

Downloading Data

In our experiments, we use the SLAM 2018 dataset. Download it from https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/8SWHNO.

After downloading the data, unzip it in the data directory.
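
For example, assuming the downloaded archive is named `dataverse_files.zip` (the actual file name may differ depending on how you download it), the following commands place the data where the scripts expect it:

```shell
# Unzip the SLAM 2018 archive into the data directory created during installation.
# The archive name below is an assumption; replace it with the file you actually downloaded.
unzip dataverse_files.zip -d data/
ls data/
```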

Parameters for the Experiment

In order to train the model, you will have to configure the parameters.ini file, which you can find at src/parameters.ini.

The file has three sections: model, options and context_features. The model section points to the train and test files. The options section controls the hyperparameters used while training the model. Lastly, the context_features section activates or deactivates the various context based features.

Set the appropriate values for the train, dev and test files. The hyperparameters are preset to the values we used in our experiments, and all the context features are activated by default.
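
The sketch below only illustrates the overall shape of parameters.ini. The three section names come from this README; every key and value shown is a placeholder, so refer to the file shipped in src/ for the actual options:

```ini
; Illustrative sketch of src/parameters.ini -- section names are real, keys and values are placeholders.
[model]
train = data/en_es.slam.train
dev   = data/en_es.slam.dev
test  = data/en_es.slam.test

[options]
; hyperparameters used while training the model
learning_rate = 0.1
iterations = 10

[context_features]
; set to True/False to activate or deactivate each context based feature
token_context = True
pos_context = True
```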

Prepare Data

After you have successfully configured the parameters.ini file, you need to prepare the data for training. This is an intermediate step in which we extract the tokens and part-of-speech tags in the surrounding context. For more details, read our paper.

To prepare the data, execute the following:

```shell
python src/prepare_data.py --params_file src/parameters.ini
```

You should be able to see three .json files in your data directory.
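
As a quick sanity check, you can list the generated files (the exact file names depend on your configuration):

```shell
# prepare_data.py writes three .json files into data/; their names depend on your parameters.ini.
ls data/*.json
```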

Train your model

To train the model, run the following command in your terminal:

```shell
python src/train_model.py --params_file src/parameters.ini
```

Note: It is recommended that you run this step only if you have sufficient memory (at least 16 GB).

Test your predictions

To evaluate your predictions, execute the following command:

```shell
python src/eval.py --params_file src/parameters.ini
```

Citation

If you make use of this work, please cite our paper:

```
@InProceedings{nayak-rao:2018:W18-05,
  author    = {Nayak, Nihal V. and Rao, Arjun R.},
  title     = {Context Based Approach for Second Language Acquisition},
  booktitle = {Proceedings of the Thirteenth Workshop on Innovative Use of NLP for Building Educational Applications},
  month     = {June},
  year      = {2018},
  address   = {New Orleans, Louisiana},
  publisher = {Association for Computational Linguistics},
  pages     = {212--216},
  url       = {http://www.aclweb.org/anthology/W18-0524}
}
```

