ML CATCHMENT
    yuc

    When I’m not doing gradient descent, you might find me doing canyon descent 🏊‍♀️ or rock ascent ⛰️🧗🏼‍♀️.

    • yu.c@nycu.edu.tw
    • Github
    • Gist
    • StackOverflow

    Anywhere Q-caches

    Remember when we tried Q-caches in the Temporal Transformer and it was a success?

    Well, the experiments produced some pretty magical results when I applied Q-caches to the Spatial Transformer’s second attention module too.

    Figure: Q-caches at the Spatial Transformer’s 2nd attention module
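    The post doesn’t include code, so here is a minimal sketch of what a Q-cache in an attention module could look like, assuming the idea is to compute the query projection once and reuse it on later forward passes while K and V are recomputed each step. All class and variable names here are illustrative, not taken from the actual Temporal/Spatial Transformer implementation:

    ```python
    import numpy as np

    class CachedQAttention:
        """Hypothetical single-head attention with a Q-cache:
        the query projection is computed on the first call and
        reused afterwards, while K/V are recomputed every step."""

        def __init__(self, dim, seed=0):
            rng = np.random.default_rng(seed)
            self.w_q = rng.standard_normal((dim, dim)) / np.sqrt(dim)
            self.w_k = rng.standard_normal((dim, dim)) / np.sqrt(dim)
            self.w_v = rng.standard_normal((dim, dim)) / np.sqrt(dim)
            self.q_cache = None  # filled on the first forward pass

        def forward(self, x, use_cache=True):
            if self.q_cache is None or not use_cache:
                self.q_cache = x @ self.w_q      # projection done once, then cached
            q = self.q_cache
            k, v = x @ self.w_k, x @ self.w_v    # still recomputed every step
            scores = (q @ k.T) / np.sqrt(x.shape[-1])
            scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
            attn = np.exp(scores)
            attn /= attn.sum(axis=-1, keepdims=True)
            return attn @ v
    ```

    In a diffusion-style loop, the second and later calls to `forward` would then skip the Q projection entirely, which is where the speed-up would come from.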

    Updated: September 1, 2024




    © 2025 ML CATCHMENT. Powered by Jekyll & Minimal Mistakes.