We built JarvisCloud with two purposes in mind: we wanted the platform to be simple and affordable for DL and AI practitioners. With JarvisCloud you get instant access to a GPU-powered Jupyter notebook with your favorite framework, and you can also SSH into it, all in one click.
Create an account at cloud.jarvislabs.ai
Add payment information in Billing section
Recharge your wallet to add funds
Select a machine type and click Launch
Start, Pause, or Delete with the buttons
Note: Paused instances are only charged for storage.
Sign up at cloud.jarvislabs.ai.
To avoid any unnecessary billing, JarvisCloud follows a prepaid system. You buy credits using a credit card, and JarvisCloud deducts money from the credits in your JarvisCloud wallet for actions like creating, pausing, and resuming instances.
JarvisCloud uses Stripe to manage all credit card transactions. All card details you share with JarvisCloud are sent directly to Stripe; JarvisCloud does not store any information related to your credit card. You can add card details in the Billing section.
Once the card is added successfully, you can recharge for a predefined amount of $10, $20, $50, or $100. If the recharge is successful, JarvisCloud adds the same amount as credits to your wallet.
Once you are logged in, you can choose different configurations such as the framework, the number of GPUs, and the storage you need. Once you have confirmed the chosen configuration, press the Launch button. Your instance should be created in less than 10 seconds. We create an instance optimized for the framework of your choice, and we also install a variety of commonly used tools and libraries like git, wget, GCC, Pandas, NumPy, scikit-learn, and Matplotlib.
Launch only when you are ready to use the instance, as instances are billed per minute.
You should see a running instance like this under the Create Instance section.
There are four key actions you can perform on a running instance.
You can access JupyterLab and also connect to the instance through the command line (optional).
Pause the instance after use to save its state for later (all installations, notebooks, and downloaded/created data are preserved until you resume or delete the instance).
Resume a paused instance to pick up where you left off.
Destroy the instance completely when your work is done to stop the billing cycle.
Clicking the green play button on a running instance opens JupyterLab. If you prefer the classic Jupyter Notebook interface, change the word lab to tree in the URL.
You can also easily access the instance through SSH. To do that, add your public SSH key in the API Keys section.
If you have created your SSH keys earlier, you can view your public key by running the command below in a local terminal / PowerShell.
If you have never created an ssh key pair then you can create one as explained in the next section.
Run the following command in a local terminal / PowerShell to generate an SSH key pair:
ssh-keygen -t rsa -b 4096
Every launched machine has an SSH command associated with it once you have added your SSH key as shown in the previous step. To copy the command specific to your machine, right-click on the running instance, go to Get URLs, and click the first option to copy the SSH string. Your SSH string will be copied to the clipboard.
Now you can simply execute the SSH command from your terminal and start training your models.
You can also access the instance and run your deep learning programs straight from Visual Studio Code in three easy steps. To do that, you should have added your public SSH key as mentioned earlier.
Install Remote - SSH Extension.
Add New SSH Host.
Connect to Host.
Install the Remote - SSH extension from the Visual Studio Code extensions marketplace.
⇧⌘P / Ctrl+Shift+P to open Command Palette
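When you choose Add New SSH Host, VS Code saves an entry in your ~/.ssh/config file. A typical entry looks like the sketch below; every value is a placeholder to be filled from your instance's SSH string:

```
Host jarvis
    HostName <host-from-your-ssh-string>
    Port <port-from-your-ssh-string>
    User <user-from-your-ssh-string>
```

You can then pick the jarvis host from Connect to Host.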
You can also connect to the instance through JetBrains DataSpell.
Below is a small demonstration from one of our users.
You can open TensorBoard to visualize your model's loss and metrics over time. To open TensorBoard, right-click on the running instance card as shown below.
After running a few experiments, if you want to pause the instance for later use, press the blue pause button. You will be charged only for storage during the paused duration.
All the paused instances are listed below the running instances.
This feature is useful when you plan to return to the instance in a few hours or a few days. If you do not plan to use the instance further, press the red trash button to destroy it; this prevents further charges for storage.
You can hit the green play button on the paused instance to resume your work. A new instance is created with all your previous data.
You can click the settings icon on the paused instance card to modify your instance parameters before resuming.
Once you have chosen the parameters, you can click on the play button to resume your instance.
If you have completed your work and no longer need the instance, press the red trash button, which deletes/destroys the entire instance and stops your billing. This is irreversible, so ensure you have backed up any important data before destroying the instance.
You can delete both running and paused instances.
Note: Ensure you destroy the instance to avoid any unnecessary billing
You can pause your instance from your Jupyter notebook or VS Code by calling the function below.
from jarviscloud import jarviscloud
jarviscloud.pause()
Once an instance is created, we would need data to train our deep learning models. In this post, we will cover some of the popular sources from which you can download data. As we come across other popular data sources, we will update the post.
In the last few years, Kaggle has become a great source for a wide variety of data. Kaggle provides a command-line tool, which helps us perform different activities like
Downloading competition datasets
Making your Kaggle submissions
Since it is such an important command-line tool, we install it on all instances. To check if it is installed, connect to the terminal of your Jarvis instance and type kaggle in your terminal.
To use the Kaggle API, we need a Kaggle API token, which can be downloaded from the Account section of your Kaggle profile.
Now let's upload kaggle.json to our remote instance, move it to ~/.kaggle/kaggle.json, and check again how the kaggle command line responds.
Now that the configuration is done, let's grab a dataset from a Kaggle competition.
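For example, downloading the data for a competition you have joined looks like this (titanic is just an illustrative competition slug, so replace it with yours; this requires the API token configured above):

```shell
# Download a competition's data archive and unzip it into its own folder.
kaggle competitions download -c titanic
unzip -o titanic.zip -d titanic
```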
Oftentimes we need small files on our instance. If the file is on your local machine, you can use JupyterLab/notebook to upload it. Avoid this approach if the file is larger than a few megabytes.
If you have a large file (greater than 10 MB), you can use secure copy (scp) to transfer files to the remote machine. You can also use it to copy data from the remote instance back to your local machine. To use scp, do not forget to add your public SSH key in the API Keys section. If you are not sure how to do that, read the Accessing the instance through SSH section.
scp -P 8961 ~/Documents/some_images.zip firstname.lastname@example.org:/home/
If your id_rsa file is not in its default location, provide its path using the -i option.
scp -i ./id_rsa -P 10960 /Users/akshath/Downloads/cat.jpeg email@example.com:/home/
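To copy in the other direction, swap the source and destination. The sketch below reuses the placeholder address and port from the example above, with a hypothetical outputs.zip as the remote file:

```shell
# Copy a file from the remote instance back to the local machine.
scp -P 10960 email@example.com:/home/outputs.zip ~/Downloads/
```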
You can Bring Your Own Container (BYOC) on Jarvislabs.ai.
To launch an instance from a public image, follow the steps below.
Once the instance is launched, right-click on the instance to get the SSH string and connect to it. The SSH string will look like this:
ssh -p 8960 firstname.lastname@example.org -L 9960:localhost:8888
Install jupyterlab in the newly created instance.
pip install jupyterlab
jupyter lab --allow-root
Copy the token printed in the terminal; you will need it to access JupyterLab. Open a browser and go to http://localhost:<local-port>, replacing <local-port> with the local port from your SSH command (9960 in the example above). Use the token from the terminal to start using JupyterLab. This feature is in beta; if you face any challenges, please reach out to us, and we will be happy to adapt and make the process smoother.
Your storage disk might be full. To delete unwanted files, use the command below: sudo rm -rf <filepath/filename>. Double-check the path before running it, as rm -rf deletes files permanently.
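Before deleting anything, it can help to confirm the disk really is full and see what is using the space; the commands below are standard Linux utilities:

```shell
# Show free space on the root filesystem.
df -h /
# List the largest top-level items under the home directory, biggest first.
du -h --max-depth=1 ~ 2>/dev/null | sort -hr | head -n 10
```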
Note: If you need any more assistance please drop an email to email@example.com or ping us on the Chat window.