For updates, it looks nice I guess.
It shouldn't matter so much, but when you don't use one language as often as you use others, it becomes that much harder to remember unfamiliar syntax, and that much easier to confuse similar-looking operations with each other.
Is there even a function in PostgreSQL that would convert a JSON-encoded "string" to the text it represents? I didn't find one.
So all you can do is compare at the jsonb level, e.g. `SELECT json_col['a'] = to_jsonb(some_text_col);`, and hope for the best (note that `some_text_col::jsonb` wouldn't work here: the cast tries to parse the text as JSON and errors out unless the column already contains valid JSON), or use the old syntax with `->>` or `#>>`.
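For comparison, the two approaches might look like this (the table `t` and its columns are hypothetical, just to make the queries concrete):

```sql
-- Hypothetical table: t(json_col jsonb, some_text_col text)

-- New subscript syntax (PG 14+): subscripting yields jsonb, so the
-- text column has to be lifted to jsonb before comparing.
SELECT json_col['a'] = to_jsonb(some_text_col) FROM t;

-- Older operators extract the scalar as text directly, so the
-- comparison happens on native text values instead:
SELECT (json_col ->> 'a') = some_text_col FROM t;
SELECT (json_col #>> '{a}') = some_text_col FROM t;
```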
Oddly, no: there's no dedicated function for taking a root-level scalar JSON term (like `'"foo"'::jsonb`) and extracting that scalar to its equivalent native Postgres type.
You can still do it (for extraction to text, at least), but you have to use a 'vacuous' path navigation to accomplish it, so it's extremely clumsy and wastes the potential of the new syntax:
`SELECT '"foo"'::jsonb #>> (ARRAY[]::text[]);`

ETA: no, actually, I was wrong: `#>>` takes `text[]`, so you can already pass it an `ARRAY[]` literal containing expressions. It's just that all the examples in the PG docs use the array I/O syntax (e.g. `'{a,b}'`) to represent the path, relying on an implicit cast to `text[]`.
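To make the "vacuous path" trick concrete, here is a short sketch of both spellings, contrasted with a plain cast (which keeps the JSON quoting instead of unwrapping the string):

```sql
-- Empty path: navigates nowhere, but #>> still unwraps the
-- root-level scalar to its text representation.
SELECT '"foo"'::jsonb #>> '{}';             -- foo (4 chars, no quotes)
SELECT '"foo"'::jsonb #>> ARRAY[]::text[];  -- equivalent, explicit array form

-- Contrast with a plain cast, which serializes the JSON value
-- and so keeps the surrounding quotes:
SELECT '"foo"'::jsonb::text;                -- "foo" (6 chars)
```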