Free Notebook Usage

Free Notebook is like a “free trial”: it lets you try out the Notebook features first and see whether they meet your needs. Although it has some limitations, it is entirely sufficient for learning and simple projects.

What’s Included in the Free Version?

Basic Resources

Computing Resources: 2 CPU cores with sufficient processing speed, 4GB of memory (enough for medium-scale data), 10GB of storage space (enough for project files), and a maximum runtime of 12 hours per session.

Software Environment: Python 3.9 with common data science packages pre-installed; you can install additional packages, and multiple programming languages are supported.

What Are the Usage Limitations?

Resource Limitations

Runtime Limitations: only 1 Notebook can run at a time, each session lasts at most 1 hour, sessions stop automatically after 30 minutes of inactivity, and total usage is capped at 100 hours per month.

Storage Limitations: the workspace holds at most 10GB, temporary storage at most 2GB, no single file may exceed 100MB, and at most 10 Notebooks can be created.

Functional Limitations

API Usage: at most 60 requests per minute, 1GB of data transfer per day, and 5 concurrent requests.
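To stay under the request quota, a simple client-side rate limiter helps. Here is a minimal sketch (the 60-per-minute figure comes from the limit above; the `RateLimiter` class itself is illustrative and not part of any Notebook API):

```python
import time
from collections import deque

class RateLimiter:
    """Allow at most max_calls within a sliding window of `period` seconds."""

    def __init__(self, max_calls=60, period=60.0):
        self.max_calls = max_calls
        self.period = period
        self.calls = deque()  # timestamps of recent calls

    def acquire(self):
        while True:
            now = time.monotonic()
            # Drop timestamps that have left the sliding window
            while self.calls and now - self.calls[0] >= self.period:
                self.calls.popleft()
            if len(self.calls) < self.max_calls:
                self.calls.append(now)
                return
            # Wait until the oldest call exits the window, then retry
            time.sleep(self.period - (now - self.calls[0]))

limiter = RateLimiter(max_calls=60, period=60.0)
# Call limiter.acquire() before each API request
```

Calling `acquire()` before every request blocks just long enough to keep you inside the quota instead of triggering server-side errors.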

Collaboration Features: at most 3 people can view a shared Notebook, sharing is read-only, viewers can add comments, and only the latest 5 versions are kept.

Getting Started

Registration Process

Create Account: Visit registration page, fill in basic information, verify email, complete setup.

Environment Initialization: Use command line tools to initialize workspace, verify Python environment.

Basic Operations

Create Notebook: use the command-line tools to create a new Notebook; you can also import sample projects.

Manage Files: upload and download files to manage your project data.

Resource Optimization

Memory Management

  1. Optimization Techniques

    import gc

    import pandas as pd
    import torch

    def optimize_memory():
        # Release objects that are no longer referenced
        gc.collect()

        # Clear the PyTorch GPU cache (no-op on the CPU-only free tier)
        if torch.cuda.is_available():
            torch.cuda.empty_cache()

    # Stream large files with a generator instead of loading them at once
    def data_generator(file_path):
        for chunk in pd.read_csv(file_path, chunksize=1000):
            yield process_chunk(chunk)  # process_chunk: your own function
    
  2. Data Processing

    # Process large files in chunks to keep peak memory low
    import pandas as pd

    def process_large_file(file_path):
        results = []
        for chunk in pd.read_csv(file_path, chunksize=1000):
            results.append(process_chunk(chunk))  # process_chunk: your own function
        return pd.concat(results)
    

Computing Optimization

  1. Parallel Processing

    from multiprocessing import Pool

    def parallel_process(data_list):
        # The free tier provides 2 CPU cores, so use 2 worker processes
        with Pool(processes=2) as pool:
            # process_function: your own top-level function
            results = pool.map(process_function, data_list)
        return results
    
  2. Code Optimization

    # Use vectorized operations
    import numpy as np

    def optimize_calculation(data):
        arr = np.asarray(data, dtype=float)
        # Whole-array arithmetic runs in compiled code and replaces an
        # explicit Python loop. Note that np.vectorize is only a Python
        # loop in disguise and gives no real speed-up.
        return arr * 2.0 + 1.0
    

Usage Tips

Improve Efficiency

Code Organization

  • Break code into different functions
  • Each function does one thing
  • Code structure should be clear
  • Easy to understand and maintain

Use Caching

  • Use cache for repeated calculations
  • Avoid running same code repeatedly
  • Save time and resources
  • Improve runtime efficiency
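The caching advice above can be sketched with Python's standard `functools.lru_cache`, which memoizes the results of repeated calls (`expensive_compute` here is just a stand-in for any costly calculation):

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def expensive_compute(n):
    # Stand-in for a costly calculation; real work would go here
    total = 0
    for i in range(n):
        total += i * i
    return total

expensive_compute(10_000)  # first call computes the result
expensive_compute(10_000)  # repeated call is served from the cache
```

`expensive_compute.cache_info()` shows hit/miss counts, which is handy for checking whether the cache is actually being reused.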

Storage Management

File Management

  • Regularly clean temporary files
  • Compress old data files
  • Delete unnecessary files
  • Keep workspace tidy

Data Optimization: choose appropriate data types, compress large datasets, and optimize the storage format to save space.
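Choosing a narrower data type can cut memory and storage roughly in half. A quick NumPy illustration, using `float32` in place of the default `float64` (acceptable whenever your data does not need double precision):

```python
import numpy as np

data = np.arange(1_000_000, dtype=np.float64)  # default double precision
small = data.astype(np.float32)                # half the bytes per element

print(data.nbytes)   # 8 bytes per element
print(small.nbytes)  # 4 bytes per element
```

The same idea applies to integers (`int64` vs `int32`/`int16`) and to pandas DataFrames via `astype`.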

When to Upgrade?

Upgrade Signals

Insufficient Resources: CPU usage frequently above 80%, memory usage frequently above 75%, storage more than 90% full, or frequently running into the limits described above.

Functional Needs: Need GPU acceleration, need larger storage space, need longer runtime, need more collaboration features.

Upgrade Steps

Preparation: Evaluate actual needs, choose suitable paid plan, prepare data migration, configure new environment.

Data Migration: Export current workspace, migrate to new environment, verify data integrity, start using new features.

Common Questions

Resource Issues

How to check resource usage? View resource monitoring in Notebook interface, use system commands to check CPU and memory, regularly check storage space usage.
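From inside a Notebook, a few standard-library calls give a quick picture of resource usage (`resource.getrusage` is Unix-only; on other platforms a package such as psutil would be needed):

```python
import os
import resource
import shutil

# Disk usage of the workspace (assumed to be the current directory)
total, used, free = shutil.disk_usage(os.getcwd())
print(f"Storage: {used / 1e9:.1f} / {total / 1e9:.1f} GB used")

# Peak memory of this process (kilobytes on Linux, bytes on macOS)
peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print(f"Peak memory (ru_maxrss): {peak}")

# Number of CPU cores visible to the process
print(f"CPU cores: {os.cpu_count()}")
```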

How to handle insufficient memory? Close unnecessary Notebooks, clean temporary variables and data, use smaller datasets, consider upgrading to paid version.

How to optimize runtime? Use efficient algorithms, avoid repeated calculations, reasonably use cache, optimize data processing workflows.

How to manage storage space? Regularly clean unnecessary files, compress large datasets, use external storage services, promptly delete temporary files.

Functional Issues

How to install additional packages? Use pip to install Python packages, use conda to install scientific computing packages, check package compatibility, note storage space limitations.

How to share Notebook? Set sharing permissions, generate sharing links, invite collaborators, control access permissions.

Limitation Issues

Runtime Limitations: each session lasts at most 1 hour and stops automatically after 30 minutes of inactivity, so plan computing tasks accordingly and save your work promptly.
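Given the one-hour session cap, saving intermediate results regularly avoids losing work when a session stops. A minimal checkpoint sketch using only the standard library (the file name and the contents of `state` are illustrative):

```python
import pickle
from pathlib import Path

CHECKPOINT = Path("checkpoint.pkl")  # illustrative file name

def save_checkpoint(state):
    # Persist intermediate results so a stopped session can resume
    with CHECKPOINT.open("wb") as f:
        pickle.dump(state, f)

def load_checkpoint(default=None):
    # Resume from the last saved state, or start fresh
    if CHECKPOINT.exists():
        with CHECKPOINT.open("rb") as f:
            return pickle.load(f)
    return default

state = load_checkpoint(default={"step": 0, "results": []})
# ... do some work, then save before the session ends ...
state["step"] += 1
save_checkpoint(state)
```

Calling `save_checkpoint` after each expensive stage means a new session only repeats the work done since the last save.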

Storage Space Limitations: Workspace maximum 10GB, single file cannot exceed 100MB, regularly clean and compress, consider upgrading to paid version.

Summary

Free Notebook is a good way to start learning AI. Used well, it lets you experience Notebook for free (no payment required), learn the basics (core programming and data-processing skills), practice on projects (complete small AI projects), and evaluate your needs (decide whether to upgrade to a paid plan).

Remember: although the free version has limitations, it is entirely sufficient for learning and simple projects. Plan ahead, use resources wisely, and you can make full use of what the free tier offers!