ML CATCHMENT
    yuc

    When I’m not doing gradient descent, you might find me doing canyon descent 🏊‍♀️ or rock ascent ⛰️🧗🏼‍♀️.

    • yu.c@nycu.edu.tw
    • Github
    • Gist
    • StackOverflow

    Anywhere Q-caches

    Remember when we tried Q-caches in the Temporal Transformer and it was a success?

    Well, I saw some pretty magical results in my experiments when I applied Q-caches to the Spatial Transformer’s second attention module as well.

    Q-caches at the Spatial Transformer’s 2nd attention module
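    To make the idea concrete, here is a minimal NumPy sketch of what a Q-cache in an attention module could look like. This is a hypothetical illustration, not the actual Temporal/Spatial Transformer code: the class name, weight layout, and the assumption that queries can be projected once and reused across later denoising calls are all mine.

    ```python
    import numpy as np

    def softmax(x, axis=-1):
        # numerically stable softmax
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    class QCachedAttention:
        """Hypothetical attention layer with a Q-cache.

        On the first call, the query projection x @ W_q is computed and
        stored; later calls reuse the cached Q and only recompute K/V
        from the (possibly changed) context. This sketch assumes queries
        stay similar enough across calls for the reuse to be acceptable.
        """

        def __init__(self, dim, rng=None):
            rng = rng or np.random.default_rng(0)
            scale = 1.0 / np.sqrt(dim)
            self.w_q = rng.standard_normal((dim, dim)) * scale
            self.w_k = rng.standard_normal((dim, dim)) * scale
            self.w_v = rng.standard_normal((dim, dim)) * scale
            self.q_cache = None  # filled on first call

        def __call__(self, x, context, use_cache=True):
            if self.q_cache is None or not use_cache:
                self.q_cache = x @ self.w_q  # project queries once, then reuse
            q = self.q_cache
            k = context @ self.w_k
            v = context @ self.w_v
            attn = softmax(q @ k.T / np.sqrt(q.shape[-1]))
            return attn @ v
    ```

    The trade-off is the usual one for caches: you skip the query projection on every call after the first, at the cost of Q drifting away from what the current input would have produced.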

    Updated: September 1, 2024




    © 2024 ML CATCHMENT. Powered by Jekyll & Minimal Mistakes.