💫 VibeCoding — Class Flow (38 Students)

Goal

Reinterpret Rafael Rozendaal’s Internet (2002) through vibe coding — translating mood, rhythm, and color into generative motion using p5.js, then extending those visuals through AI video generation. Students learn to describe visuals as parameters, experiment with procedural rhythm, and compare human-coded versus AI-interpreted motion.

Setup

Each team submits its deliverables (listed under Deliverables below) in a shared class folder.

Example folder:

/VibeCoding 2025/
   Flow_Collective/
      sketch_link.txt
      frame_first.png
      frame_last.png
      ai_video.mp4
      notes.txt
   Neon_Void/
   Tranquil_Circuit/
   ...

Timing Overview

| Stage | Activity | Duration | Notes |
|-------|----------|----------|-------|
| 1 | Introduce Rozendaal & Internet | 10 min | Discuss rhythm, repetition, minimalism. |
| 2 | p5.js Demo & Account Setup | 10 min | Walk through p5 login & template copy. |
| 3 | Team VibeCoding Session | 30 min | Directors describe, Tuners adjust parameters. |
| 4 | Frame Export | 10 min | Save first & last frames. |
| 5 | AI Video Generation | 15 min | Instructor queues pairs on Replicate. |
| 6 | Showcase & Discussion | 15 min | Compare human vs AI motion. |

Total: ~90 min (max 2 h 30 min with setup)

Creating a p5.js Account & Saving Sketches

  1. Visit 👉 https://editor.p5js.org
  2. Click “Sign Up” (top-right) → sign up with GitHub, Google, or email.
  3. Once logged in:

    • Click “New Sketch.”
    • Rename it with your team name (e.g., Flow_Collective).
    • Copy the instructor template code.
  4. Use Run ▶️ to preview.
  5. Use File → Save to keep your changes.
  6. To share: click Share → Copy Link and paste it in sketch_link.txt.
  7. To save manually, press Ctrl+S / Cmd+S. The editor also auto-saves every 30 seconds when logged in.
  8. To save an image: right-click the canvas in the preview pane → Save Image As.

Using the Starter Template

A lightweight motion seed:

let dots = [];     // the moving points
let chaos = 0.01;  // phase speed: smaller = slower, calmer motion

function setup() {
  createCanvas(600, 600);
  // scatter 80 dots at random positions
  for (let i = 0; i < 80; i++) {
    dots.push(createVector(random(width), random(height)));
  }
}

function draw() {
  background(0, 10);  // low-alpha fill leaves fading trails
  stroke(255, 150);   // semi-transparent white
  for (let p of dots) {
    // each dot drifts along coupled sine/cosine fields
    p.x += sin(frameCount * chaos + p.y * chaos);
    p.y += cos(frameCount * chaos + p.x * chaos);
    point(p.x, p.y);
  }
}

Teams experiment only by changing numbers and colors (no syntax edits). Encourage verbal prompts: “slower,” “denser,” “brighter,” “chaotic.”
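For instructors who want to predict what a number change will do before students try it, the core motion rule from draw() can be studied outside p5 in plain JavaScript. A sketch (stepDot is a hypothetical helper name, not part of the template; Math.sin/Math.cos stand in for p5's sin/cos):

```javascript
// One dot's per-frame update, extracted from the starter sketch.
// `chaos` scales how fast the sine/cosine phases advance; the step
// itself is always at most 1 pixel per axis per frame, which is why
// the motion stays smooth no matter how "chaotic" the phases get.
function stepDot(p, frameCount, chaos) {
  const x = p.x + Math.sin(frameCount * chaos + p.y * chaos);
  // like the sketch, the y update sees the already-updated x
  const y = p.y + Math.cos(frameCount * chaos + x * chaos);
  return { x, y };
}

// Trace one dot for ~10 s at the editor's default 60 fps.
let p = { x: 300, y: 300 };
for (let f = 0; f < 600; f++) p = stepDot(p, f, 0.01);
console.log(p);
```

Raising chaos makes the phase spin faster (more erratic direction changes); it does not make the dots move farther per frame.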

💡 Code Assistance Tools

Students may use a language model to help refine parameters or interpret errors:

Suggested prompt for assistance:

“I have this p5.js code. How can I make the motion slower and more organic, without adding new functions?”
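"Slower" has a precise meaning in this sketch that students (or a model) can check by hand: the phase frameCount * chaos advances by chaos radians per frame, so one full sine cycle takes 2π/chaos frames. A small helper (cycleSeconds is a hypothetical name; 60 fps is the editor's default frame rate):

```javascript
// Seconds per full oscillation cycle for a given `chaos` value.
// One cycle = 2π radians of phase, advanced `chaos` radians per frame.
function cycleSeconds(chaos, fps = 60) {
  return (2 * Math.PI) / chaos / fps;
}

console.log(cycleSeconds(0.01).toFixed(1));  // ≈ 10.5 s per cycle
console.log(cycleSeconds(0.005).toFixed(1)); // ≈ 20.9 s — halving chaos doubles the cycle
```

So a concrete answer to "make it slower" is simply to lower chaos, with a predictable effect on the rhythm.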

Exporting Frames

  1. Wait until the sketch reaches a moment you like (stopping the sketch clears the preview, so leave it running).
  2. Right-click the canvas in the preview pane → Save Image As to export the current frame (PNG).
  3. Save one early frame (≈ 10 s) and one late frame (≈ 60 s).
  4. Upload both to your team folder.
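If right-clicking is fiddly, frames can also be saved from code. A hypothetical addition to the starter sketch (the keyPressed handler and the 'frame_' filename are not part of the instructor template), using p5's built-in saveCanvas():

```javascript
// Press 's' while the sketch runs to download the current canvas as a PNG.
// saveCanvas(filename, extension) is part of the p5.js API; frameCount in
// the filename records when the frame was captured (e.g., frame 600 ≈ 10 s
// at 60 fps, frame 3600 ≈ 60 s).
function keyPressed() {
  if (key === 's') {
    saveCanvas('frame_' + frameCount, 'png');
  }
}
```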

AI Video Interpretation Phase

  1. Select your two frames.
  2. Upload the pair to Replicate using the ByteDance Seedance-1 Lite model (720p preferred).
  3. Generate a 3–5 s video showing the interpolation between frames.
  4. Optional: add audio from Day 02 PromptVision for cross-modal effect.
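For the showcase discussion, it helps to contrast the AI result with the simplest possible "interpolation": a linear cross-fade of pixel values, which is emphatically not what a video model does — Seedance synthesizes new motion between the frames. A baseline sketch (crossFade is a hypothetical name; frames are simplified to flat grayscale arrays rather than real RGBA buffers):

```javascript
// Naive linear cross-fade between two frames, as a mental baseline
// for comparing against AI-generated motion.
// t = 0 returns the first frame, t = 1 the last; values in between blend.
function crossFade(first, last, t) {
  return first.map((v, i) => v * (1 - t) + last[i] * t);
}

const first = [0, 0, 255, 255];
const last = [255, 255, 0, 0];
console.log(crossFade(first, last, 0.5)); // halfway blend: every value is 127.5
```

Anything the AI video shows beyond this dissolve — new trajectories, pulses, textures — is the model's own interpretation, which is exactly what the discussion should surface.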

Showcase & Feedback

Deliverables

Each team uploads:

sketch_link.txt
frame_first.png
frame_last.png
ai_video.mp4
notes.txt

Example notes.txt:

Our sketch explores the vibe of connection and flow through oscillating points and soft motion.
The AI video interpreted it as a dense wave field, amplifying the chaos and color pulses.

✅ Outcome

Students translate Rozendaal’s web-art logic into parameterized generative motion, see how AI reimagines that vibe in video, and reflect on authorship, motion, and interpretation across media.