Better dynamic query building #12
Hi. Thanks :) Yeah, it can take a while to wrap your head around tagged template functions, but once you do you start seeing great uses for them everywhere (at least I did) ;). The first thing is to understand the difference between tagged template functions and regular template literal strings. Tagged template functions allow the library to handle the interpolated values: Postgres.js converts normal values and replaces their position in the query with positional parameters ($1, $2, and so on). That means you can't pass keywords and identifiers directly as raw string arguments when calling the sql function. To still allow more dynamic queries than where Postgres accepts positional arguments, there is sql.unsafe, which should be used with care. I want to improve the documentation on that end, and already have issue #4 open to track it. I hope that helps, and thanks for the good description of your concerns :) |
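The mechanics described above can be sketched with a toy tag function (a minimal illustration, not the real Postgres.js internals): the tag receives the literal string parts and the interpolated values separately, which is what lets a library substitute positional parameters instead of concatenating raw strings.

```javascript
// Toy tag function showing what a tagged template receives: the raw
// string parts and the interpolated values arrive separately, so a
// library can turn each value into a positional parameter instead of
// splicing it into the query text.
function toySql(strings, ...values) {
  // Rebuild the query text with $1, $2, ... where the values were.
  const text = strings.reduce(
    (acc, part, i) => acc + '$' + i + part
  );
  return { text, values };
}

const id = 42;
const q = toySql`select * from users where id = ${id}`;
// q.text   === 'select * from users where id = $1'
// q.values === [42]
```

A plain (untagged) template literal, by contrast, concatenates the value into the string immediately, so the library never gets a chance to parameterize it.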
If you can tell me where in the docs it seemed like this was allowed, I'd love to make it clearer that it is not possible. |
@porsager Thanks for your reply, much appreciated. Funny you mention the difference, as I had to go re-read the docs about them, had a play about, and realised "oh, that's how it works". What I did not know was that Postgres already does the positional arguments for you; I somehow believed this was always done on the client/library side. Which explains why I can't build my statement outside of the tagged template method the way I thought I could. But regarding your sql function, the example only shows a simple case; for example, here is a basic one.
But there will be times, based on different situations, where I will have to programmatically generate queries using partial fragments, for example:
I'm not able to do it like this. |
What about |
That's a great case @okbrown. It'd be easy to just defer and disallow dynamic queries of that kind, but I'd like for this library to have a safe, ergonomic way to support it. @Minigugus is right about that. I'll leave this issue open, and I'll try to think some more about it. |
Same problem here, but I want to insert data into a table named by a templated var: await sql `
insert into ${measure.company} (
label, value
) values (
${measure.label}, ${measure.value}
)
` But I get the same error :
|
Hi @throrin19. Any SQL identifier or keyword must be added with the sql() function, like this: await sql `
insert into ${ sql(measure.company) } (
label, value
) values (
${ measure.label }, ${ measure.value }
)
` The following would also be valid: await sql `
insert into ${ sql(measure.company) } ${
sql(measure, 'label', 'value')
}
` |
Same problem here. I have this query being built (simplified here): const select = 'avg(value)';
const where = `
firstWhere = 'value' AND
secondWhere = 'value'`;
const result = sql`SELECT ${ sql(select) } FROM "table" WHERE ${ sql(where) }`; I have this generated SQL query : SELECT "avg(value)" FROM "table" WHERE "
firstWhere = 'value' AND
secondWhere = 'value'" I also tried this without the sql() wrapper. How do I do this correctly? |
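The quoting behaviour seen above can be reproduced with a minimal identifier-escaping sketch (quoteIdent is a hypothetical helper for illustration, not the actual Postgres.js implementation):

```javascript
// Minimal sketch of SQL identifier escaping: the whole argument is
// wrapped in double quotes, with any embedded double quotes doubled.
// Passing an expression or a multi-line WHERE fragment through it
// therefore yields one (useless) quoted identifier, never SQL syntax.
function quoteIdent(name) {
  return '"' + String(name).replace(/"/g, '""') + '"';
}

const col = quoteIdent('firstWhere');    // usable: "firstWhere"
const notSql = quoteIdent('avg(value)'); // one identifier: "avg(value)"
```

This is why sql(select) turned avg(value) into the quoted identifier "avg(value)": identifier escaping treats its whole argument as a single name.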
Lol :D I love your ideas @Minigugus, but it's actually the entire reason it's there... so you don't have to hack through the tagged template function :P |
I think, for all these cases, the postgres lib should have functions to do this. I understand it's not the idea of this library, but if you want to create dynamic select, insert, ... statements, the only available option is the unsafe one. |
Oh, I'd like to support it, but it has to be done properly, so I'm not rushing it ;) I'll get around to it soon enough. |
I didn't read the full thread but do have some ideas: https://www.npmjs.com/package/prepare-sql I made this module because I needed to string together pieces of queries ad hoc. What it does: the template function checks if the value to interpolate is another SQL object, and then merges the ends of the string parts together and splices in the free variables. Then it's easy to split up the variable parts, and they are automatically spliced in when run. Perhaps the same technique could be used here? |
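The merging technique described can be sketched as follows (assumed semantics for illustration; frag is a hypothetical stand-in, not the actual prepare-sql code): when an interpolated value is itself a fragment, its edge string parts are fused with the surrounding parts and its values are spliced in, keeping placeholder numbering consistent.

```javascript
// Sketch of fragment merging: a tag that recognises nested fragments
// and fuses their string parts into the outer query.
function frag(strings, ...values) {
  const parts = [strings[0]];
  const vals = [];
  values.forEach((v, i) => {
    if (v && v.__frag) {
      // Merge: fuse the nested fragment's edges into the outer strings.
      parts[parts.length - 1] += v.parts[0];
      parts.push(...v.parts.slice(1));
      vals.push(...v.vals);
      parts[parts.length - 1] += strings[i + 1];
    } else {
      vals.push(v);
      parts.push(strings[i + 1]);
    }
  });
  return {
    __frag: true, parts, vals,
    // Render with $1, $2, ... placeholders.
    get text() { return parts.reduce((a, p, i) => a + '$' + i + p); },
  };
}

const where = frag`where id = ${7}`;
const q = frag`select * from t ${where} limit ${10}`;
// q.text === 'select * from t where id = $1 limit $2'
// q.vals === [7, 10]
```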
Hi, is there any progress on this issue apart from the dynamic table name? |
Hello, thanks for your hard work. For recent work, I need a dynamic "WHERE" condition. I read the documentation, and there is a part about it, but I need a dynamic "where" from which I can remove some clauses, like this:
Is it possible? Or do I need to use sql.unsafe? I'd prefer not to use unsafe. Thank you |
Do you have any opinion on a syntax similar to slonik? It seems an API like await sql`SELECT * FROM statistic WHERE 1=1 ${ userId && sql.partial('AND user_id = ?', userId) }` could work. In slonik they provide more APIs. Do you think this kind of syntax would fit into the scheme of this project? Maybe such helpers should be in third-party modules? But this kind of feature seems to be asked for very often. |
@samyisok thanks a lot. This will be possible in v2, but currently it's not.. You can achieve it like this though: sql`
SELECT
*
FROM statistic
WHERE 1 = 1
AND (true = ${ userId ? false : true } OR user_id = ${ userId })
` @tnickel001 yeah, this is basically what I've got in the works for 2.0, but in a simpler and safer form. I'm sorry it's dragging out, but work and another recent project are taking up all my time. I've got vacation in a couple of weeks, so there's a good chance I'll have some time for this again :) |
Hi, await sql`SELECT * FROM statistic WHERE 1=1 ${ userId ? sql.partial`AND user_id = ${userId}` : sql.skip }`
// or
const andUserId = userId ? sql.partial`AND user_id = ${userId}` : sql.skip
await sql`SELECT * FROM statistic WHERE 1=1 ${andUserId}` Maybe someone could give it a try. |
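A standalone sketch of how such a partial/skip API could behave (the names come from the proposal above; none of this is part of Postgres.js):

```javascript
// `partial` captures a fragment, `skip` is a sentinel that contributes
// nothing, and the outer tag flattens both into one parameterized query.
const skip = Symbol('skip');
const partial = (strings, ...values) => ({ partial: true, strings, values });

function build(strings, ...values) {
  const parts = [strings[0]];
  const params = [];
  values.forEach((v, i) => {
    const tail = strings[i + 1];
    if (v === skip) {
      parts[parts.length - 1] += tail;         // drop the clause entirely
    } else if (v && v.partial) {
      parts[parts.length - 1] += v.strings[0]; // splice the fragment in
      v.values.forEach((pv, j) => {
        params.push(pv);
        parts.push(v.strings[j + 1]);
      });
      parts[parts.length - 1] += tail;
    } else {
      params.push(v);
      parts.push(tail);
    }
  });
  const text = parts.reduce((a, p, i) => a + '$' + i + p);
  return { text, params };
}

const userId = 5;
const q1 = build`SELECT * FROM statistic WHERE 1=1 ${
  userId ? partial`AND user_id = ${userId}` : skip}`;
// q1.text === 'SELECT * FROM statistic WHERE 1=1 AND user_id = $1'
```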
Anything new on this feature request? Do you accept PRs on this? @porsager |
@yckao |
@yckao Thanks, man |
Using this one for quite some time now: const isNil = value => value === undefined || value === null;
/**
* Format and append query parts, and execute the final result in a safe way.
* Undefined values are skipped, as they are not allowed in queries.
* The query call may be one of the interpolated values. Supports being called as a template literal.
*
* @since 0.1.0
*
* @template {*} T
* @param {string[]} strings
* @param {...*} values
* @returns {QueryPart<T>}
*/
export function query(strings, ...values) {
let _strings = [];
const _values = [];
const result = {
get strings() {
return _strings;
},
get values() {
return _values;
},
append,
exec,
};
// Flatten nested query parts
let didFlatten = false;
for (let i = 0; i < strings.length - 1; ++i) {
if (didFlatten) {
didFlatten = false;
_strings[_strings.length - 1] += strings[i];
} else {
_strings.push(strings[i]);
}
if (isQueryPart(values[i])) {
append(values[i]);
didFlatten = true;
} else {
_values.push(values[i]);
}
}
if (didFlatten) {
_strings[_strings.length - 1] += strings[strings.length - 1];
} else {
_strings.push(strings[strings.length - 1]);
}
return result;
function append(query) {
const last = _strings[_strings.length - 1];
const [first, ...rest] = query.strings;
_strings = [..._strings.slice(0, -1), `${last} ${first}`, ...rest];
_values.push.apply(_values, query.values);
return result;
}
async function exec(sql) {
let str = _strings[0];
let valueIdx = 1;
for (let i = 0; i < _values.length; ++i) {
if (_values[i] === undefined) {
str += `${_strings[i + 1]}`;
} else {
str += `$${valueIdx++}${_strings[i + 1]}`;
}
}
// Strip out undefined values
return await sql.unsafe(
str,
_values.filter((it) => it !== undefined),
);
}
}
/**
* Check if the passed in value is an object generated by 'query``'.
*
* @since 0.1.0
*
* @param {*} query
* @returns {boolean}
*/
export function isQueryPart(query) {
return (
!isNil(query) &&
Array.isArray(query?.strings) &&
Array.isArray(query?.values)
);
} which abuses the fact that 'undefined' may not be passed in as query parameters. const sql = postgres()
const where = query`foo = ${5}`;
const concatWhere = query`type = ${"myType"} AND `.append(where);
const offset = 15;
const queryResult = await query`SELECT * FROM "bar" WHERE ${concatWhere} OFFSET ${offset}`.exec(sql); It supports both appending and nested interpolation. |
I hope to see the day when this lands in main. |
I am also waiting for this feature. I hope it will land in V2. |
I have an implementation that looks promising, in my opinion. It would still need to be fleshed out, though. It allows nesting sql and arrays of sql fragments, and therefore creates some strong primitives for arbitrary extensions. /**
*
* @param {string} string
* @returns {[TemplateStringsArray, ...any[]]}
*/
export function unsafe2(string) {
/** @type {import("../../lib/types").WritableTemplateStringsArray} */
let r = [string];
r.raw = [string];
return [/** @type {TemplateStringsArray} */ (r)];
}
/**
*
* @param {TemplateStringsArray} _strings
* @param {...(string|[TemplateStringsArray, ...(string|string[]|boolean|number)[]]|[TemplateStringsArray, ...(string|string[]|boolean|number)[]][]|string[]|boolean|number)} _keys
* @returns {[TemplateStringsArray, ...(string|string[]|boolean|number)[]]}
*/
export function sql2(_strings, ..._keys) {
const strings = _strings;
const keys = _keys;
/** @type {import("../../lib/types").WritableTemplateStringsArray} */
const r = [""];
r.raw = [""];
/** @type {import("../../lib/types").WritableTemplateStringsArray} */
const rd = ["", ""];
rd.raw = ["", ""];
const stringsAsTemplates = strings.map(unsafe2)
// array of templates
/** @type {[TemplateStringsArray, ...(string|string[]|boolean|number)[]][]} */
const flattened = stringsAsTemplates.flatMap((m, i) => {
if (i == keys.length) {
return [m];
}
// array of flat template strings
if (Array.isArray(keys[i]) && keys[i].every(p => Array.isArray(p) && typeof p[0] === "object" && 'raw' in p[0])) {
return [m, ...keys[i]]
}
// flat template string
if (Array.isArray(keys[i]) && typeof keys[i][0] === "object" && 'raw' in keys[i][0]) {
return [m, keys[i]];
}
// primitive
return [m, [rd, keys[i]]];
})
const result = flattened.reduce((previous, current) => {
/** @type {import("../../lib/types").WritableTemplateStringsArray} */
const templateStrings = [
...previous[0].slice(0, -1),
previous[0].slice(-1)[0] + current[0][0],
...current[0].slice(1)
];
templateStrings.raw = [
...previous[0].raw.slice(0, -1),
previous[0].raw.slice(-1)[0] + current[0].raw[0],
...current[0].raw.slice(1)
];
return /** @type {[TemplateStringsArray, ...any[]]} */ ([/** @type {TemplateStringsArray} */ (templateStrings), ...previous.slice(1), ...current.slice(1)]);
}, /** @type {[TemplateStringsArray, ...any[]]} */ ([r]));
return result
}
/**
*
* @param {[TemplateStringsArray, ...(string|string[]|boolean|number)[]]} sql
*/
export function sql2ToString(sql) {
return sql[0].map((s, i) => {
if (i+1 == sql.length) {
return s;
}
return s + JSON.stringify(sql[i+1]);
}).join("")
}
console.log(sql2`SELECT * FROM test`)
console.log(sql2`SELECT ${"hill"}`)
console.log(sql2`SELECT ${sql2`* FROM test`} WHERE ${1}`)
/** @type {any[]} */
let list = ["id", "title", "info"];
console.log(sql2`SELECT "id", "title", "info", "place" FROM projects WHERE 1${list.map(
(v) => sql2` AND (${unsafe2(v)} < ${1})`
)} OR NOT ... params() ORDER BY ${list.map(
(v) => sql2`${unsafe2(v)} ASC, `
)} LIMIT 1337`); A completely wrong use (but I still want to show the idea) is the following: import { sql } from "./database.js";
import { sql2, sql2ToString, unsafe2 } from "./sql/index.js";
/**
* @template T
* @param {string} table
* @param {string[]} fieldsToSelect
* @param {{ [field: string]: 'nulls-first'|'nulls-last' }} orderByInfo
* @param {import("../lib/types").BaseQuery<T>} _query
* @param {T} sanitizedData
* @param {(query: T) => [TemplateStringsArray, ...(string|string[]|boolean|number)[]]} customFilterQuery
* @returns {[TemplateStringsArray, ...(string|string[]|boolean|number)[]]}
*/
export function fetchData(
table,
fieldsToSelect,
orderByInfo,
_query,
sanitizedData,
customFilterQuery
) {
const query = _query;
// orderBy needs to be reversed for backwards pagination
if (query.paginationDirection === "backwards") {
query.sorting = query.sorting.map((v) => [
v[0],
v[1] === "ASC" ? "DESC" : "ASC",
]);
}
if (query.sorting.length === 0) {
query.sorting = [["id", "ASC"]];
}
const orderByQuery = query.sorting
.flatMap((v) => [sql2`,`, sql2`${unsafe2(v[0])} ${unsafe2(v[1])}`])
.slice(1);
if (query.paginationCursor === null) {
return sql2`(SELECT ${unsafe2(fieldsToSelect.join(", "))} FROM ${unsafe2(
table
)} WHERE ORDER BY ${orderByQuery} LIMIT ${query.paginationLimit + 1})`;
} else {
let queries = query.sorting.map((value, index, array) => {
const part = query.sorting.slice(0, index + 1);
let parts = part
.flatMap((value, index) => {
return [
sql2` AND `,
sql2`${query.paginationCursor[value[0]] ?? null} ${
index === part.length - 1
? value[1] === "ASC"
? sql2`<`
: sql2`>`
: sql2`IS NOT DISTINCT FROM`
} ${unsafe2(value[0] ?? null)}`,
];
})
.slice(1);
return sql2`(SELECT ${unsafe2(fieldsToSelect.join(", "))} FROM ${unsafe2(
table
)} WHERE${customFilterQuery(
sanitizedData
)} (${parts}) ORDER BY ${orderByQuery} LIMIT ${
query.paginationLimit + 1
})`;
});
return sql2`${queries
.flatMap((v) => [sql2`\nUNION ALL\n`, v])
.slice(1)} LIMIT ${query.paginationLimit + 1}`;
}
}
const value = fetchData(
"users",
["id", "type", "username", "password_hash"],
{
id: "nulls-first",
type: "nulls-first",
username: "nulls-first",
password_hash: "nulls-first",
},
{
filters: {},
paginationCursor: { username: "aaaa", type: "helper" },
paginationDirection: "forwards",
paginationLimit: 10,
sorting: [
["type", "DESC"],
["username", "ASC"],
["id", "DESC"],
],
},
{
name: "test",
},
(query) => {
return sql2``;
}
);
console.log(value)
console.log(sql2ToString(value))
console.log(await sql(...value));
await sql.end() |
That's the query function I'm using in a project. The code is already quite optimized.

example.js

const db = require('./db')
async function start () {
// using a placeholder, similar to knex
const rawWhere = db.raw('WHERE id = ?', [1])
const res = await db.sql`SELECT users ${rawWhere}`
// Using a tag function
const where = db.raw.sql`WHERE id = ${1}`
// append using `.sql`
where.sql` AND admin = true`
// You can nest raw instances or queries, we have a `db.identifier` function that returns a Raw instance
const dynamicColumn = 'something'
where.sql` AND ${db.raw(dynamicColumn)} = ${1}`
const res2 = await db.sql`SELECT users ${where}`
const query = db.sql`SELECT ${1}`
// Supports embedding all values
// This is useful if you want to execute multiple statements within the same query
query.toText()
// returns 'SELECT 1'
// This is the query that's behind a regular parameterized statement
// There's a custom serialize logic for arrays to keep compatibility with `pg`
query.toQuery()
// returns {text: 'SELECT $1', values: [1]}
// Arrays are serialized to postgres arrays
db.sql`SELECT ${[1, 2, 3]}::int[]`.toText()
// returns `SELECT '{1,2,3}'::int[]`
// If you want to enforce json serialization, use `::json` or `::jsonb`
db.sql`SELECT ${[1, 2, 3]}::json`.toText()
// returns `SELECT '[1,2,3]'::json`
}
start()

db.js

const postgres = require('postgres')
const {Query, Raw} = require('./query')
const client = postgres()
const sql = (strings, ...values) => new Query(strings, values, client)
const raw = (strings, values) => new Raw(strings, values)
raw.sql = (strings, ...values) => new Raw(strings, values)
module.exports = {sql, raw}

query.js

'use strict'
class SQLParameterError extends Error {
name = 'SQLParameterError'
constructor (opts) {
super(opts.message)
if (opts.text) this.text = opts.text
if (opts.values) this.values = opts.values
}
}
class Raw {
constructor (text, values) {
this.texts = []
this.values = []
switch (typeof text) {
case 'undefined':
break
case 'string':
if (values?.length) _appendToQuery(this, text.split('?'), values)
else this.texts.push(text)
break
case 'object':
_appendToQuery(this, text, values)
break
default:
throw new TypeError(`The parameter 'text' must be an array of strings or a string.`)
}
}
prefix (strings, ...values) {
if (typeof strings === 'string') {
this.texts[0] = `${strings}${this.texts[0] || ''}`
} else {
strings[0] = `${strings[0]}${this.texts[0] || ''}`
this.texts.splice(0, 1, ...strings)
if (values) this.values.unshift(...values)
}
}
sql (strings, ...values) {
_appendToQuery(this, strings, values)
return this
}
freeze () {
Object.freeze(this)
Object.freeze(this.values)
return this
}
}
function _appendToQuery (self, strings, values) {
if (strings.length === 0) return
if (self.texts[0] === undefined) self.texts.push(strings[0])
else self.texts[self.texts.length - 1] += strings[0]
for (let i = 0; i < values.length; i++) {
const value = values[i]
const after = strings[i + 1] || ''
if (value === undefined) {
throw new SQLParameterError({
message: `The parameter type '${typeof value}' is not supported in db.sql`,
text: strings.join('?'),
values
})
}
if (value instanceof Raw || value instanceof Query) {
if (!value.texts.length) {
self.texts[self.texts.length - 1] += after
continue
}
self.texts[self.texts.length - 1] += value.texts[0]
if (value.texts.length > 1) self.texts.push(...value.texts.slice(1))
self.texts[self.texts.length - 1] += after
if (value.values) self.values.push(...value.values)
continue
}
self.values.push(value)
self.texts.push(after)
}
}
class Query {
constructor (strings, _values, client) {
this.texts = []
this.values = []
this.client = client
_appendToQuery(this, strings, _values)
}
sql (strings, ...values) {
_appendToQuery(this, strings, values)
return this
}
async _queryPostgres () {
const query = this.toQuery(true)
return this.client.unsafe(
query.text,
query.values?.length ? query.values : undefined
)
}
then (resolve, reject) {
return this._queryPostgres().then(resolve, reject)
}
catch (reject) {
return this._queryPostgres().catch(reject)
}
embedded () {
this.toQuery = this.toTextQuery
return this
}
// This is only useful if you want to freeze a partial to prevent modifications
// e.g. db.sql`WHERE = "something"`.freeze()
// no .options, .transacting or query call will be possible after executing this.
freeze () {
Object.freeze(this)
Object.freeze(this.values)
return this
}
toText () {
const strings = this.texts
let text = strings[0]
for (let i = 0; i < this.values.length; i++) {
text += `${serializeToRawLiteral(this.values[i], strings[i + 1])}${strings[i + 1]}`
}
return text
}
toTextQuery () {
return {text: this.toText()}
}
toQuery (nativePlaceholder = true) {
const values = this.values
const strings = this.texts
const text = nativePlaceholder ? fastJoin(strings, values.length) : strings.join('?')
for (let i = 0; i < values.length; i++) {
const value = values[i]
switch (typeof value) {
case 'number':
if (!Number.isFinite(value)) {
throw new SQLParameterError({
message: `NaN and Infinity are not supported in db.sql`,
text,
values
})
}
break
case 'boolean':
case 'bigint':
break
case 'string':
values[i] = value.replace(escapeNull, '')
break
case 'object':
if (value === null) break
else if (value instanceof Date) values[i] = value.toISOString()
else if (value instanceof Buffer) break
else values[i] = serializeObject(value, strings[i + 1]) || serializeArray(value)
break
default:
throw new SQLParameterError({
message: `The parameter type '${typeof value}' is not supported in db.sql`,
text,
values
})
}
}
return {text, values}
}
}
// The string join operation is rather slow compared to the other operations.
// This speeds up the query composition a bit
function createFastText (num) {
// eslint-disable-next-line no-template-curly-in-string
let str = '${strings[0]}'
for (let i = 1; i <= num; i++) str += `$${i}\${strings[${i}]}`
// eslint-disable-next-line no-new-func
return new Function('strings', `return \`${str}\``)
}
const fastText = {}
for (let i = 0; i < 20; i++) fastText[i] = createFastText(i)
function fastJoin (strings, len) {
if (len < 20) return fastText[len](strings)
let text = fastText['19'](strings)
for (let i = 20; i <= len; i++) {
text += `$${i}${strings[i]}`
}
return text
}
// Postgres (the database, not library) can't handle null characters well in strings and json
const escapeNull = /\\u0000/g
const escapeBackslash = /\\/g
const escapeApos = /'/g
const escapeQuote = /"/g
function serializeToRawLiteral (value, after) {
switch (typeof value) {
case 'string':
return serializeRawLiteralString(value)
case 'boolean':
return `${value === true}`
case 'number':
if (Number.isFinite(value)) return value
throw new SQLParameterError({message: `The number '${value}' is not supported in db.sql`})
case 'object':
if (value === null) return 'NULL'
else if (value instanceof Date) return `'${value.toISOString()}'`
else if (value instanceof Buffer) return `decode('${value.toString('base64')}', 'base64')`
else {
const text = serializeObject(value, after)
if (text !== undefined) return serializeRawLiteralString(text)
return escapeForUnicode(serializeArray(value))
}
default:
throw new SQLParameterError({
message: `The parameter type '${typeof value}' is not supported in db.sql`
})
}
}
function serializeObject (value, after) {
if (value[Symbol.iterator]) {
// Maps should be handled like json objects
// Arrays don't apply the same JSON.stringify logic
if (value instanceof Map) {
return jsonStringify(Object.fromEntries(value))
} else {
// With a ${array}::json or ${array}::jsonb, the array is passed as a string.
// Otherwise if an array gets passed, there's a fallback to the serializeArray function
if (!after.startsWith('::json')) return
return jsonStringify(typeof value.length === 'number' ? value : Array.from(value))
}
} else {
return jsonStringify(value)
}
}
function jsonStringify (obj) {
return JSON.stringify(obj).replace(escapeNull, '')
}
// eslint-disable-next-line no-control-regex
const hasEscape = /[\n\f\r\t\u0000-\u00ff]/
function escapeForUnicode (str) {
return `${
hasEscape.test(str) ? 'E' : ''
}'${
str.replace(escapeApos, `''`)
}'`
}
function serializeRawLiteralString (str) {
return `${
hasEscape.test(str) ? 'E' : ''
}'${
str.replace(escapeBackslash, `\\\\`)
.replace(escapeApos, `''`)
.replace(escapeNull, '')
}'`
}
// https://www.postgresql.org/message-id/22396.1270154243%40sss.pgh.pa.us
function serializeStringArrayMember (str) {
return `"${
str.replace(escapeBackslash, `\\\\`)
.replace(escapeQuote, '\\"')
.replace(escapeNull, '')
}"`
}
function serializeArray (arr) {
if (arr.length === undefined) arr = Array.from(arr)
if (arr.length === 0) return '{}'
let str = ''
for (let i = 0; i < arr.length; i++) {
const v = arr[i]
switch (typeof v) {
case 'number':
if (Number.isFinite(v)) {
str += `,${v}`
break
}
throw new SQLParameterError({
message: `NaN and Infinity are not supported in db.sql`,
values: arr
})
case 'bigint':
str += `,${v}`
break
case 'string':
str += `,${serializeStringArrayMember(v)}`
break
case 'boolean':
str += `,${v === true ? 't' : 'f'}`
break
case 'object':
if (v == null) {
str += `,NULL`
break
} else if (v instanceof Date) {
str += `,"${v.toISOString()}"`
break
}
// Serialize regular objects
if (!v[Symbol.iterator]) {
str += `,${serializeStringArrayMember(jsonStringify(v))}`
break
}
// Maps should be handled like json objects
if (v instanceof Map) {
str += `,${serializeStringArrayMember(jsonStringify(Object.fromEntries(v)))}`
break
}
if (v instanceof Buffer) {
str += `,\\\\x${v.toString('hex')}`
break
}
// Arrays apply the native postgres array syntax
str += `,${serializeArray(v)}`
break
default:
throw new Error(`Unsupported array child type: ${typeof v}`)
}
}
return `{${str.slice(1)}}`
}
module.exports = {Raw, Query} |
Hi Marc. Looks interesting. I've been finishing up my idea (it actually ended up as a complete rewrite of all of Postgres.js) over the holidays, so I'll see if there are some clever things to use from your implementation, if that's ok? 😊 |
Ohh sounds interesting, maybe a v2 drop sometime in January? (or is this v3 now??) |
Yeah, with the long time this v2 beta has been running, going to v3 might not be a bad idea 😂 As things are shaping up right now I'm hoping to have a release candidate ready for the weekend, and definitely a stable release this month. |
Yes, that's ok. I don't want to keep it for myself 😃 |
FWIW, this has been the main issue I keep running into with this library. I ended up using a combo of workarounds. Definitely looking forward to v3! |
It's functionally ready, and I just pushed the rewrite branch yesterday 😊 Now what's missing is documentation, examples, finishing typescript types and some more tests for the new features. |
Huzzah! Where can I donate? |
Hehe 🤩 Thanks! https://github.com/sponsors/porsager |
Hi, First and foremost, I think this is a great start at making something really simple and effective. Thank you!!
However... I have always found template strings a little odd, especially in this context.
Let us say we have
const query = `SELECT ${distinct}, ${columns} FROM ${table} ${innerJoins} WHERE ${where}`;
(where distinct, columns, table, etc. are their own strings built up by other functions elsewhere)
which basically outputs...
I should then be able to do
sql`${query}`.then((data) => console.log(data))
or await
etc. But instead I receive a syntax error:
Error: syntax error at or near "$1" at Object.generic (../node_modules/postgres/lib/types.js:200:5)
Gonna take a look at how this tagged template function actually works, as I'm aware it replaces the values with $1, $2 and so forth to evaluate them and pass them to the statement, e.g.
select * from table where id = $1
=> select * from table where id = somevalue
At the moment, it means I have to place all my select statements, big or small, directly in this method, which will start looking very messy, especially as queries become more complex and the app grows.