query GetCurrentUser {
viewer {
sessionToken
user {
objectId
}
}
}
And this is my header:
{"X-Parse-Application-Id":"myAppId","X-Parse-Session-Token":"r:75db68f1cc3....a5fbe43b718bf3f4"}
For some reason it fails here:
const response = await _rest.default.find(config, context.auth, '_User',
// Get the user itself from the auth object
{
objectId: context.auth.user.id
}, options, info.clientVersion, info.context);
The value for context.auth.user is undefined! I tried to debug it, and it seems this value comes from req.auth. I am wondering if there is a bug somewhere?
android: {
apiKey: '...'
},
which is not consistent with what’s explained on the push-adapter repo
android: {
firebaseServiceAccount: __dirname + '/firebase.json'
}
In src/Adapters/Auth/keycloak.js:
* @param {Array} [authData.roles] - The roles assigned to the user in Keycloak (optional).
* @param {Array} [authData.groups] - The groups assigned to the user in Keycloak (optional).
This means the groups and roles keys in authData are optional.
But in the code:
if (
response &&
response.data &&
response.data.sub == id &&
arraysEqual(response.data.roles, roles) &&
arraysEqual(response.data.groups, groups)
) {
return;
}
it calls arraysEqual on roles and groups, which are undefined when not present in authData. There should be something like:
if (
response &&
response.data &&
response.data.sub == id &&
(typeof roles === 'undefined' || arraysEqual(response.data.roles, roles)) &&
(typeof groups === 'undefined' || arraysEqual(response.data.groups, groups))
) {
return;
}
But after some tries, I found out that in my responses from the Keycloak userinfo endpoint, I didn't get a data key. In fact it works with:
if (
response &&
response.sub == id &&
(typeof roles === 'undefined' || arraysEqual(response.roles, roles)) &&
(typeof groups === 'undefined' || arraysEqual(response.groups, groups))
) {
return;
}
I used this code in a custom auth adapter, but it might be useful to fix it upstream.
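For reference, a self-contained sketch of the guarded comparison as a plain function (the arraysEqual helper below is a minimal stand-in for the adapter's own; the other names follow the post):

```javascript
// Minimal stand-in for the adapter's arraysEqual helper.
function arraysEqual(a, b) {
  if (!Array.isArray(a) || !Array.isArray(b) || a.length !== b.length) return false;
  return a.every((value, i) => value === b[i]);
}

// Guarded comparison: roles/groups checks are skipped entirely when the
// caller did not supply them in authData (i.e. they are undefined).
function userinfoMatches(response, id, roles, groups) {
  return Boolean(
    response &&
    response.sub === id &&
    (roles === undefined || arraysEqual(response.roles, roles)) &&
    (groups === undefined || arraysEqual(response.groups, groups))
  );
}
```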
In fact, I have also tried to use the oauth2 adapter, and unfortunately I think it does not work.
The code in src/Adapters/Auth/oauth2.js :
* {
* "auth": {
* "oauth2Provider": {
...
* }
* }
*
const response = await fetch(this.tokenIntrospectionEndpointUrl, {
method: 'POST',
headers: {
'Content-Type': 'application/x-www-form-urlencoded',
...(this.authorizationHeader && {
Authorization: this.authorizationHeader
})
},
body: new URLSearchParams({
token: accessToken,
})
});
but it didn't work at all; the response always returned '401: Unauthorized'.
I got it working with
* {
* "auth": {
* "oauth2": {
...
* }
* }
*
and this implementation
const response = await fetch(this.tokenIntrospectionEndpointUrl, {
method: 'POST',
headers: {
'Content-Type': 'application/x-www-form-urlencoded',
...(this.authorizationHeader && {
Authorization: this.authorizationHeader
})
},
body: new URLSearchParams({
token: accessToken,
client_secret: "***redacted***",
client_id: "myclient"
})
});
with a Keycloak client myclient configured for client authentication.
Any opinion on the matter? I believe the keycloak and oauth2 auth providers are not widely used, as I did not find much help on this subject.
parse-dashboard: 7.5.0

Cloud Code
Parse.Cloud.define("testOpenAI", async (request) => {
const OpenAIResponse = Parse.Object.extend("OpenAIResponse");
const openAIResponse = new OpenAIResponse();
openAIResponse.set('response', '');
await openAIResponse.save();
const openai = new OpenAI({
apiKey: process.env.OPENAI_API_KEY,
});
var tokenCount = 0;
const stream = openai.responses
.stream({
model: 'gpt-5-nano-2025-08-07',
instructions: 'You are a storyteller that talks like a pirate.',
input: 'Tell a ten word story about a tardigrade.',
stream: true,
})
.on("response.output_text.delta", (event) => {
openAIResponse.set('response', `${openAIResponse.get('response')}${event.delta}`);
tokenCount++;
openAIResponse.set('tokenCount', tokenCount);
openAIResponse.save();
})
.on("response.error", (event) => {
console.error(event.error);
});
});
Relevant Client Code
liveQueryClient.open();
const query = new Parse.Query("OpenAIResponse");
const subscription = liveQueryClient.subscribe(query);
subscription.on("update", data => {
res.innerHTML = data.get('response');
});
async function testApp() {
try {
await Parse.Cloud.run("testOpenAI");
} catch (e) {
console.log(`testOpenAI failed - ${e}`);
}
};
testApp().then(()=>{
console.log('testApp completed')
});
Parse Server 3.2.3
NodeJs 10.15.1
Host Nodechef
Our /files endpoint works for the most part, except when we add some images the subsequent GET request fails due to missing CORP headers.
Our Web client sets headers COEP & COOP to enable OPFS for SQLite:
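On the server side, one possible workaround is to add the missing CORP header on /files responses before the Parse mount. This is a sketch, assuming Parse Server is mounted in an Express app under /parse and that a `Cross-Origin-Resource-Policy: cross-origin` value is acceptable for your files:

```javascript
// Hypothetical Express-style middleware: adds the CORP header that
// COEP-enabled pages require before they will load cross-origin resources.
// The header value and the /parse/files path prefix are assumptions;
// adjust both to your deployment.
function corpForFiles(req, res, next) {
  if (req.path.startsWith('/parse/files/')) {
    res.setHeader('Cross-Origin-Resource-Policy', 'cross-origin');
  }
  next();
}

// Usage sketch: register it before mounting Parse Server, e.g.
//   app.use(corpForFiles);
```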
But the last changes were in Nov 2021.
I was wondering whether I could adapt an Android Kotlin app to use Kotlin Multiplatform instead, but the lack of Parse support for Kotlin Multiplatform would hold me back if there isn't more than a proof of concept, possibly used by nobody.
I think this bug is still not fixed.
However, I have another problem: the web socket trace and log are not printed in the Xcode console.
CREATE TABLE test_questions (
id SERIAL PRIMARY KEY,
content TEXT NOT NULL,
embedding VECTOR(512)
);
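A sketch of how such a table could be queried from Node with node-postgres (the table and column names come from the DDL above; the vector text format follows pgvector's documented literal syntax, and the pool/connection setup is assumed):

```javascript
// pgvector accepts vector values as a text literal like '[0.1,0.2,0.3]'.
// Helper to serialize a JS number array into that form.
function toVectorLiteral(values) {
  if (!values.every(Number.isFinite)) {
    throw new TypeError('vector components must be finite numbers');
  }
  return '[' + values.join(',') + ']';
}

// Nearest-neighbor query text using pgvector's `<->` (L2 distance)
// operator; $1 is the serialized query vector, $2 the result limit.
const knnSql =
  'SELECT id, content FROM test_questions ' +
  'ORDER BY embedding <-> $1 LIMIT $2';

// Usage sketch with node-postgres (not executed here):
//   const { rows } = await pool.query(knnSql, [toVectorLiteral(queryVec), 5]);
```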
This is the code:
import ParseLiveQuery
import ParseCore
class LiveQueryManager {
static let shared = LiveQueryManager()
private var client: ParseLiveQuery.Client
private var TestObjectSubscription: Subscription<PFObject>?
let TestObjectQuery = PFQuery(className: "TestObject")
init() {
client = ParseLiveQuery.Client(
server: "http://192.168.2.1:1337",
applicationId: "parseAppID",
clientKey: "parseClientID"
)
client.shouldPrintWebSocketLog = true
client.shouldPrintWebSocketTrace = true
TestObjectSubscription = client.subscribe(TestObjectQuery) //client.subscribe(query)
TestObjectSubscription!.handle(Event.created) { _, book in
if let title = book["title"] as? String, let id = book.objectId {
print("Book \(title) was added")
}
}
TestObjectSubscription!.handle(Event.updated) { _, book in
if let title = book["title"] as? String, let id = book.objectId {
print("Book \(title) was updated")
}
}
TestObjectSubscription!.handle(Event.deleted) { _, book in
if let title = book["title"] as? String, let id = book.objectId {
print("Book \(title) was deleted")
}
}
TestObjectSubscription!.handleSubscribe { query in
print("TestObject subscription status changed to \(query)")
}
TestObjectSubscription!.handleSubscribe { TestObjectQuery in
print("Subscribed to Live Query for TestObject updates")
}
TestObjectSubscription!.handle(LiveQueryErrors.InvalidJSONError.self) { query, error in
print("InvalidJSONError \(error)")
}
TestObjectSubscription!.handle(LiveQueryErrors.InvalidResponseError.self) { query, error in
print("InvalidResponseError \(error)")
}
TestObjectSubscription!.handle(LiveQueryErrors.InvalidQueryError.self) { query, error in
print("InvalidQueryError \(error)")
}
TestObjectSubscription!.handle(LiveQueryErrors.InvalidJSONObject.self) { query, error in
print("InvalidJSONObject \(error)")
}
TestObjectSubscription!.handle(LiveQueryErrors.ServerReportedError.self) { query, error in
print("ServerReportedError \(error)")
self.client.reconnect()
}
ParseLiveQuery.Client.shared = client
}
}
Is there any way I could use pgvector and an embedding column in Parse Server with PostgreSQL?
Thanks very much.
I have enabled the pgvector extension in my PostgreSQL server, which is the database for my Parse Server.
Is the pgvector extension supported in Parse Server objects? And how can I add an embedding column to the schema?
Thanks v much
function MyComponent() {
useEffect(() => {
let subscription;
const tagQuery = new Parse.Query("Stables");
tagQuery.subscribe().then((subs) => {
subscription = subs;
// subs.on(...)
});
// Cleanup on component unmount. Is this the right pattern? Please critique!
return () => {
if (subscription) subscription.unsubscribe();
};
}, []);
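One common refinement of the pattern above is to also guard against the unmount happening before subscribe() resolves, so a late-arriving subscription is released immediately. A framework-free sketch (the helper name is hypothetical; any object with subscribe()/unsubscribe() works):

```javascript
// Subscribes to a live query and returns a cleanup function suitable as a
// useEffect return value. If cleanup runs before subscribe() resolves, the
// subscription is released as soon as it arrives instead of leaking.
function createSubscriptionEffect(query) {
  let subscription = null;
  let cancelled = false;

  query.subscribe().then((subs) => {
    if (cancelled) {
      // Unmounted before subscribe resolved: release immediately.
      subs.unsubscribe();
      return;
    }
    subscription = subs;
    // subs.on(...) handlers would be attached here.
  });

  return () => {
    cancelled = true;
    if (subscription) subscription.unsubscribe();
  };
}
```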
We are running a legacy app with Parse JS SDK v3.4.4 and have noticed a performance issue: each user instance spawns multiple WebSocket (LiveQuery) connections — typically 8–10 connections visible in the browser network tab. This creates unnecessary load on our reverse proxy, since every connection establishes a separate TLS/TCP channel.
We’ve already ensured that:
- Parse.initialize is only called once at app startup.
- The Parse instance is imported as a singleton and shared across all frontend components.

Yet, multiple ws://.../socket.io connections are still created.
It seems every Parse.Query subscription creates a new WebSocket client.
This is our singleton sample code:
import Parse from "parse/dist/parse.min.js";
const APPLICATION_ID = "APPTEST";
const SERVER_URL = process.env.REACT_APP_SERVER_URL;
function initializeParse() {
if (!window._PARSE_CLIENT_INITIALIZED) {
Parse.initialize(APPLICATION_ID);
Parse.serverURL = SERVER_URL;
const basePath = process.env.REACT_APP_LIVEQUERY_SERVER_URL || "/socket.io";
const protocol = window.location.protocol === "https:" ? "wss:" : "ws:";
const host = window.location.host;
if (process.env.NODE_ENV === "production") {
Parse.liveQueryServerURL = `${protocol}//${host}${basePath}`;
} else {
Parse.liveQueryServerURL = "ws://localhost:1335/";
}
window._PARSE_CLIENT_INITIALIZED = true;
console.log("Parse initialized");
}
}
initializeParse();
export default Parse;
Then on react component:
import Parse from "../parseConfig.js";
const DB_Timetables = Parse.Object.extend("Timetables");
let tagQuery = new Parse.Query("Stables");
tagQuery.subscribe().then((subs) => {
subs.on("open", () => {
tagQuery.notEqualTo("tTags", 0);
tagQuery.find().then((results) => {
if (results.length !== 0) {
let tally = [0, 0, 0, 0, 0, 0, 0, 0];
results.forEach((result) => {
const tags = result.get("tTags");
if ((tags & tTagInfo.t1) !== 0) ++tally[0];
if ((tags & tTagInfo.t2) !== 0) ++tally[1];
if ((tags & tTagInfo.t3) !== 0) ++tally[2];
if ((tags & tTagInfo.t4) !== 0) ++tally[3];
if ((tags & tTagInfo.t5) !== 0) ++tally[4];
if ((tags & tTagInfo.t6) !== 0) ++tally[5];
if ((tags & tTagInfo.t7) !== 0) ++tally[6];
if ((tags & tTagInfo.t8) !== 0) ++tally[7];
});
this.setState({ tTagTally: tally });
}
});
});
subs.on("update", (object) => {
tagQuery.notEqualTo("tTags", 0);
tagQuery.find().then((results) => {
if (results.length !== 0) {
let tally = [0, 0, 0, 0, 0, 0, 0, 0];
results.forEach((result) => {
const tags = result.get("tTags");
if ((tags & tTagInfo.t1) !== 0) ++tally[0];
if ((tags & tTagInfo.t2) !== 0) ++tally[1];
if ((tags & tTagInfo.t3) !== 0) ++tally[2];
if ((tags & tTagInfo.t4) !== 0) ++tally[3];
if ((tags & tTagInfo.t5) !== 0) ++tally[4];
if ((tags & tTagInfo.t6) !== 0) ++tally[5];
if ((tags & tTagInfo.t7) !== 0) ++tally[6];
if ((tags & tTagInfo.t8) !== 0) ++tally[7];
});
this.setState({ tTagTally: tally });
}
});
});
});
Can you help me with this please? Is there a wrong pattern in how we use the library?
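One pattern worth trying is to share subscriptions across components instead of calling subscribe() in every component. This is a sketch; whether it removes the extra sockets depends on why they are opened, and the cache key scheme is an assumption:

```javascript
// Hypothetical module-level cache: the first caller subscribes, later
// callers reuse the same pending/established subscription per cache key.
const subscriptionCache = new Map();

function getSharedSubscription(key, makeQuery) {
  if (!subscriptionCache.has(key)) {
    subscriptionCache.set(key, makeQuery().subscribe());
  }
  return subscriptionCache.get(key); // a Promise resolving to the subscription
}

// Usage sketch in a component:
//   const subs = await getSharedSubscription('Stables',
//     () => new Parse.Query('Stables'));
//   subs.on('update', handleUpdate);
```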
I'm working with a nested class structure and need help crafting an efficient query.
Property class → has a pointer to Project
Project class → has pointers to Developer and ProjectStatus

I want to filter Property objects based on:
- the ProjectStatus of the related Project
- the Developer of the related Project

const developerQuery = new Parse.Query('Developer');
developerQuery.equalTo('objectId', developerId);
const projectQuery = new Parse.Query('Project');
projectQuery.matchesQuery('developer', developerQuery);
const propertyQuery = new Parse.Query('Property');
propertyQuery.matchesQuery('project', projectQuery);
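For reference, chained matchesQuery constraints compile to nested $inQuery clauses in the REST `where` parameter. A sketch of the equivalent `where` object, extended with the ProjectStatus filter (the projectStatus field name is an assumption taken from the class description):

```javascript
// Builds the REST-API `where` clause equivalent to the chained queries
// above, adding a ProjectStatus pointer constraint on the Project subquery.
function buildPropertyWhere(developerId, projectStatusId) {
  return {
    project: {
      $inQuery: {
        className: 'Project',
        where: {
          developer: {
            $inQuery: {
              className: 'Developer',
              where: { objectId: developerId },
            },
          },
          projectStatus: {
            __type: 'Pointer',
            className: 'ProjectStatus',
            objectId: projectStatusId,
          },
        },
      },
    },
  };
}
```

Note that since developerId is already known, constraining the Project query directly with an equalTo on a Developer pointer avoids the inner subquery entirely and is usually cheaper than matchesQuery on objectId.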
So what is the correct way to set a property to undefined when saving an object?
- (void)_setObject:(id)object forKey:(NSString *)key onlyIfDifferent:(BOOL)onlyIfDifferent {
PFParameterAssert(object != nil && key != nil,
@"Can't use nil for keys or values on PFObject. Use NSNull for values.");
resend dependency in your own app, and just use the payload converter to feed it the right payload. It can be HTML, React, or whatever the resend package expects.

All that's needed for Resend seems to be to add a very lightweight (few lines of code) payload converter. The Resend npm package would not be added to the adapter; the adapter would relay your payload to it. Whatever payload you use (HTML, React code, etc.) should work. We intentionally designed the adapter to be that flexible. Maybe take a look at how the existing payload converters work to get a better idea.
resend package, which probably wouldn't be necessary if you directly used the Resend REST API with our mail adapter and a payload converter.

Could anyone let us know if it's available on Android and iOS, or an alternate way to do it?
]]>I just published a new Parse Server email adapter for Resend:
parse-server-resend-adapter
It supports customizable React Email templates for verification and password reset, easy branding, and TypeScript.
Check it out: npmjs.com/package/parse-server-resend-adapter
Feedback and contributions welcome!
Off-topic:
Can you show how you connected to postgres with parse server?
I’m not able to connect with this URL.
const DB_URL_PROD = "postgres://parse:[email protected]:5433/parse"
let parse = new ParseServer({
appId: "impressionserver",
masterKey: "impressionserver",
appName: "ImpressionServer",
cloud: "./cloud/main",
databaseURI: DB_URL_PROD,
Firstly, sorry for posting here if this isn't the correct place to do so.
I would like to have my account deleted (or anonymized) as well as all my data. Due to no longer using this forum and for privacy reasons.
Thank you in advance.
/aggregate endpoint with a GET request. For example, I have a collection called Posts, and each object has an ACL field that looks like this:
{
"ACL": {
"J2QgVTpjig": { "read": true },
"*": { "read": false }
},
"title": "Post title",
"content": "Some content"
}
I want to use aggregate with $match so that it only returns documents where the ACL allows the current user (objectId: J2QgVTpjig) to read, or where it’s publicly readable.
I wrote this pipeline:
[
{
"$match": {
"$or": [
{ "ACL.J2QgVTpjig.read": true },
{ "ACL.*.read": true }
]
}
},
{
"$project": {
"title": 1,
"content": 1,
"createdAt": 1
}
}
]
and sent it like this:
GET /parse/aggregate/Posts?pipeline=<url-encoded pipeline>
Is this the recommended way to enforce ACL restrictions when using /aggregate in Parse Server?
Because normally, when using find or get, Parse automatically filters results by ACL. But it seems that aggregate does not apply ACL checks automatically, so I have to do it manually inside $match.
Is that correct?
Or is there a more secure / recommended way to ensure ACL is respected in aggregate pipelines?
- find automatically filters by ACL, but does aggregate?
- Should I apply $match manually for ACL in the pipeline when using the /aggregate GET endpoint?

Thanks a lot for any help or best practices!
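If you do stay with the manual approach, the $match stage from the pipeline above can be generated per request. A sketch, assuming the stored ACL shape shown earlier in the post:

```javascript
// Builds a $match stage that keeps only documents readable by the given
// user or by the public ('*') entry, matching the ACL shape in the post.
function aclReadMatchStage(userId) {
  return {
    $match: {
      $or: [
        { [`ACL.${userId}.read`]: true },
        { 'ACL.*.read': true },
      ],
    },
  };
}
```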
I followed the documentation:
https://parseplatform.org/parse-server/api/release/LDAP.html#LDAP
export const parseServer = ParseServer({
databaseURI: config.DATABASE_URI,
cloud: config.CLOUD_PATH,
serverURL: config.SERVER_URL,
logsFolder: './logs',
publicServerURL: config.SERVER_PUBLIC_URL,
appName: config.APP_NAME,
appId: config.APPLICATION_ID,
masterKey: config.MASTER_KEY,
masterKeyIps: config.MASTER_KEY_IPS,
maintenanceKey: process.env.MAINTENANCE_KEY || 'myMaintenanceKey', // Required property
//masterKeyIps: ['::1', '127.0.0.1','192.168.50.164', '192.168.50.229', '192.168.1.51'],
allowClientClassCreation: true,
encodeParseObjectInCloudFunction: true,
verbose: true,
logLevel: 'silly',
auth: {
ldap: {
url: "ldaps://ipa.test.dev:636",
suffix: "cn=users,cn=accounts,dc=zucca,dc=dev",
dn: "uid={{id}},cn=users,cn=accounts,dc=zucca,dc=dev",
groupCn: "ipausers",
groupFilter: "(memberUid={{dn}})"
}
},
javascriptKey: 'test123',
verifyUserEmails: false,
emailVerifyTokenValidityDuration: 2 * 60 * 60,
liveQuery: {
classNames: ['ItemsTest', 'ItemsTest'],
},
});
but when running:
export async function login(username: string, password: string): Promise<User> {
const user = await Parse.User.logInWith('ldap', {
id: username,
password: password
});
return parseUser(user);
}
either:
Invoke-RestMethod -Uri "https://pcb.test.dev/parse/users" `
>> -Method Post `
>> -Headers @{
>> "X-Parse-Application-Id"="myAppId";
>> "X-Parse-REST-API-Key"="Test123";
>> "X-Parse-Javascript-Key"="Test123";
>> "Content-Type"="application/json"
>> } `
>> -Body '{"authData":{"ldap":{"id":"TestUser","password":"TEst.1234"}}}'
I get this error:
Invoke-RestMethod:
{
"code": 1,
"error": "LDAP auth configuration missing"
}
What am I doing wrong? Should I install something else? Could you help me? Thank you.
I am currently experimenting with this change to the push adapter code:
Is it OK for PostgreSQL?
Thank you!
I made good progress, yet I ran into an issue that I did not see (or missed) in the breaking changes, and want to ask about it here.
In my cloud code beforeFind / afterFind triggers, I've noticed that the ParseObjectSubclass instances supplied in the context under objects are no longer usable as pointers in a ParseQuery.
An example object like this was usable in v5.6.0 and was cast to a pointer correctly:
ParseObjectSubclass {
className: 'Cube',
_objCount: 2385,
id: 'VOD-052755'
}
However, now in v8, unless I call .toPointer() on the object, I cannot use it in pointer queries.
Since I have a lot of calls throughout my codebase and I don't want to add .toPointer() to all of them, any ideas on what I can do?
I tried setting the encodeParseObjectInCloudFunction option to false but this was not the culprit.
Would appreciate any input! Thanks!
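One low-touch option is a small normalizer applied at shared query-building helpers rather than at every call site. This is a sketch; the helper name is hypothetical, and it does not restore the old implicit behavior inside the SDK itself:

```javascript
// Hypothetical shim: normalize anything that looks like a Parse object or
// pointer into the REST pointer shape before using it as a query value.
function asPointer(obj) {
  if (obj && obj.__type === 'Pointer') return obj; // already a pointer
  if (obj && typeof obj.toPointer === 'function') return obj.toPointer();
  if (obj && obj.className && obj.id) {
    // Bare { className, id } shape, like the trigger-context objects above.
    return { __type: 'Pointer', className: obj.className, objectId: obj.id };
  }
  throw new TypeError('Cannot convert value to a Parse pointer');
}
```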
Parse.initialize(process.env.APP_ID, process.env.JS_KEY, process.env.MASTER_KEY);
and now it's working!
Error: {"message":"Cannot use the Master Key, it has not been provided.","code":141}
Prior to upgrading to 3.x I was able to pass {useMasterKey: true} in the push call but now this doesn’t work.
Sorry I just saw your response from some months ago.
Aside from the cloud functions I have defined in cloud/main.js, I am also defining cloud code functions outside of that file. For example I have one file as follows located in the models folder:
models/Incident.js
Parse.Cloud.afterSave("Incident", (request) => {
  // do something after saving the object
});
This was working fine before updating to parse-server 3.0, but now that I have upgraded I see the following error when starting up my parse server:
TypeError: Parse.Cloud.afterSave is not a function
If I move the afterSave trigger to cloud/main.js then the error goes away and the afterSave trigger works. I want to avoid doing this since I have a lot of classes that share the same style of code, and that would require me to move multiple afterSave triggers to main.js.
For example, I'm using it like this:
cloud/main.js
const {Parse} = require("parse");
require('./triggers/test')
cloud/triggers/test.js
Parse.Cloud.beforeDelete("test", async (req) => {
console.log("beforeDelete",req.object);
});
models/Incident.js
Parse.Cloud.afterSave("Incident", (request) => {
  // do something after saving the object
});
This was working fine before updating to parse-server 3.0 but now that I have upgraded I see the following error when starting up my parse server:
TypeError: Parse.Cloud.afterSave is not a function
If I move the afterSave trigger to cloud/main.js then the error goes away and the afterSave trigger works. I want to avoid doing this since I have a lot of classes that share the same style of code and that would require me to move multiple afterSave triggers to main.js.
Or you could also try the Mongo API Adapter.
I wrote about using it with Parse Server here.
Note that while we helped in developing the testing framework for the adapter, it is a 3rd party adapter, hosted and maintained by Oracle.