
What are the main contributions of P-tuning? #76


Description

@2catycm

If it is just an implementation of existing methods and not novel, why was the P-tuning paper published at a top CCF-A conference, and why is it so widely cited?

So I wonder: what is the core difference between P-tuning, prefix tuning, and deep soft prompt tuning?

From my literature review, it seems that prepending to K and V was not actually proposed in prefix tuning, yet many papers wrongly describe prefix tuning as modifying K and V.
So is the K/V formulation actually your invention? To my knowledge, prefix tuning is more like deep visual prompt tuning in Jia's paper (VPT), which prepends learned tokens to the input x at each layer, not to K and V.
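
To make sure I am describing the distinction correctly, here is a rough sketch in plain PyTorch. The attention is simplified to a single head, and the names (`deep_prompt`, `prefix_k`, `prefix_v`) are mine for illustration, not taken from any of the papers:

```python
import torch

batch, seq, dim, n_prefix = 2, 8, 16, 4
h = torch.randn(batch, seq, dim)  # hidden states entering one transformer layer

# (a) Deep prompt tuning / VPT style: prepend learned tokens to the layer input x.
deep_prompt = torch.nn.Parameter(torch.randn(n_prefix, dim))
h_prompted = torch.cat([deep_prompt.expand(batch, -1, -1), h], dim=1)
# the layer now processes n_prefix + seq tokens, and the prompt positions
# also produce outputs that flow into the next layer

# (b) K/V-prefix style: leave x untouched, prepend learned vectors to K and V only.
W_q = torch.nn.Linear(dim, dim, bias=False)
W_k = torch.nn.Linear(dim, dim, bias=False)
W_v = torch.nn.Linear(dim, dim, bias=False)
prefix_k = torch.nn.Parameter(torch.randn(n_prefix, dim))
prefix_v = torch.nn.Parameter(torch.randn(n_prefix, dim))

q = W_q(h)                                                      # (batch, seq, dim)
k = torch.cat([prefix_k.expand(batch, -1, -1), W_k(h)], dim=1)  # (batch, n_prefix + seq, dim)
v = torch.cat([prefix_v.expand(batch, -1, -1), W_v(h)], dim=1)

attn = torch.softmax(q @ k.transpose(-1, -2) / dim ** 0.5, dim=-1)
out = attn @ v  # still (batch, seq, dim): queries come only from the real tokens
```

If I understand correctly, (a) adds extra token positions that produce their own outputs, while (b) only changes what the real tokens attend to. Is (b) the formulation your work uses?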

I also find it worth noting that your implementation relies on the KV cache that HF transformers already provides (`past_key_values`) as an important implementation mechanism. Is that also a contribution?
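
This is roughly what I mean by "using the KV cache as the implementation hook", sketched against GPT-2. The prefixes here are random and untrained, purely for illustration, and the exact `past_key_values` container format depends on the transformers version (older tuple-of-tuples vs the newer Cache objects):

```python
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

model = GPT2LMHeadModel.from_pretrained("gpt2")
tok = GPT2TokenizerFast.from_pretrained("gpt2")

n_prefix = 4
cfg = model.config
head_dim = cfg.n_embd // cfg.n_head

# one (key, value) pair per layer, each of shape (batch, n_head, n_prefix, head_dim)
prefix_kv = tuple(
    (torch.randn(1, cfg.n_head, n_prefix, head_dim),
     torch.randn(1, cfg.n_head, n_prefix, head_dim))
    for _ in range(cfg.n_layer)
)

inputs = tok("Hello world", return_tensors="pt")
# the attention mask has to cover the prefix positions as well
mask = torch.cat([torch.ones(1, n_prefix, dtype=torch.long), inputs["attention_mask"]], dim=1)
out = model(input_ids=inputs["input_ids"], attention_mask=mask, past_key_values=prefix_kv)
print(out.logits.shape)  # (1, seq_len, vocab_size): only the real tokens get logits
```

So the learned prefixes ride on the same `past_key_values` path that the generation-time cache uses, rather than being inserted as extra input tokens.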
