Jobs API
The Codex Jobs GraphQL API provides the functionality to create, update, delete, and otherwise manage jobs within Codex. Each organization has its own Jobs API GraphQL path, /v2/{organization}/jobs/graphql, which is served from the main Codex API domain (api.codex.gjirafa.tech in production). For an organization named acme, for example, the full endpoint would be https://api.codex.gjirafa.tech/v2/acme/jobs/graphql.
Schema
The GraphQL schema used to create, update, and delete jobs is described below.
Types
The Codex Jobs GraphQL API defines several types for managing jobs. Below is a detailed description of each type.
Job
type Job {
  id: String!
  name: String
  type: JobType!
  fileKey: String
  status: JobStatus!
  tasks: [JobTask]
  recurringJobId: String
  finishedAt: DateTime
  summary: JobSummary
  createdAt: DateTime!
  updatedAt: DateTime
  createdBy: User
  updatedBy: User
}
RecurringJob
type RecurringJob {
  id: String!
  name: String
  type: RecurringJobType!
  status: RecurringJobStatus!
  cronExpression: String
  hasJobRuns: Boolean
  baseEnvironment: Environment
  targetEnvironment: Environment
  createdAt: DateTime!
  updatedAt: DateTime
  createdBy: User
  updatedBy: User
}
JobTaskLog
type JobTaskLog {
  id: String
  jobTaskId: String
  message: String
  loggedAt: DateTime!
  type: LogType!
  jobId: String
}
JobSummary
type JobSummary {
  totalLogs: Long
  totalErrors: Long
  totalWarnings: Long
  elapsedTime: TimeSpan!
  startedAt: DateTime!
}
JobCollection
type JobCollection {
  items: [Job]
  offset: Int!
  limit: Int!
  total: Int!
}
JobTaskLogCollection
type JobTaskLogCollection {
  items: [JobTaskLog]
  offset: Int!
  limit: Int!
  total: Int!
}
RecurringJobCollection
type RecurringJobCollection {
  items: [RecurringJob]
  offset: Int!
  limit: Int!
  total: Int!
}
Mutations
The API exposes the following mutations for job management; usage examples are given in the Example Queries section below.
Mutation
type Mutation {
  initFileUpload(initFileUploadInput: InitFileUploadInput!): InitFileUploadPayload!
  completeMultiPartUpload(completeMultipartUploadInput: CompleteMultipartUploadInput!): CompleteMultipartUploadPayload!
  createImportJob(createJobInput: CreateImportJobInput!): CreateImportJobPayload!
  createExportJob(createJobInput: CreateExportJobInput!): CreateExportJobPayload!
  createDeleteJob(createJobInput: CreateDeleteJobInput!): CreateDeleteJobPayload!
  createEnvironmentJob(createJobInput: CreateEnvironmentJobInput!): CreateEnvironmentJobPayload!
  syncEnvironmentJob(createJobInput: SyncEnvironmentsJobInput!): SyncEnvironmentsJobPayload!
  deleteEnvironmentJob(createJobInput: DeleteEnvironmentJobInput!): DeleteEnvironmentJobPayload!
  runRecurringJob(createJobInput: RunRecurringJobInput!): RunRecurringJobPayload!
  deleteRecurringJob(deleteRecurringJobInput: DeleteRecurringJobInput!): DeleteRecurringJobPayload!
  toggleRecurringJobStatus(toggleRecurringJobInput: ToggleRecurringJobStatusInput!): ToggleRecurringJobStatusPayload!
  createCloneSiteJob(createJobInput: CreateCloneSiteJobInput!): CreateCloneSiteJobPayload!
}
Enums
Several enums are defined to specify job and task types and statuses.
JobType
enum JobType {
  UNDEFINED
  IMPORT_ASSETS
  IMPORT_ENTRIES
  DELETE_CONTENT
  CREATE_ENVIRONMENT
  SYNC_ENVIRONMENTS
  DELETE_ENVIRONMENT
  DELETE_FIELD
  CLONE_SITE
  EXPORT_ENTRIES
  DELETE_FILTERED_CONTENT
}
JobStatus
enum JobStatus {
  UNDEFINED
  ACTIVE
  DELETED
  PENDING
  FAILED
  COMPLETED
}
RecurringJobType
enum RecurringJobType {
  UNDEFINED
  SYNC
}
RecurringJobStatus
enum RecurringJobStatus {
  UNDEFINED
  ACTIVE
  DELETED
  SCHEDULED
  FAILED
  COMPLETED
  PAUSED
}
JobTaskType
enum JobTaskType {
  PROCESS_ASSETS
  PROCESS_ENTRIES
  DELETE_CONTENT
  CLONE_DATABASE_DATA
  CLONE_INDEXED_DATA
  SYNC_DATABASE_DATA
  SYNC_INDEXED_DATA
  DELETE_ENVIRONMENT_DATA
  REINDEX_ENTRIES
  REINDEX_URLS
  REINDEX_TAGS
  REINDEX_GENERAL
  CLONE_SITE
}
JobTaskStatus
enum JobTaskStatus {
  ACTIVE
  FAILED
  PENDING
  COMPLETED
}
Inputs and Payloads
For import jobs that require files (Import Assets and Import Entries), the file upload must be handled on the client side before the job is created. First, call the initFileUpload mutation to have the Jobs GraphQL API generate presigned URLs; the client then uses these URLs to upload the file parts directly to storage as a multipart upload. The Jobs GraphQL API does not perform the upload itself; it only issues the presigned URLs.
Once the multipart upload has finished on the client side, call the completeMultiPartUpload mutation to notify the Jobs GraphQL API that the file has been uploaded. After that, you can create the import job. Both mutations are sketched below.
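A minimal sketch of the two mutations, assuming PresignedUrlPart exposes partNumber and url fields and MultipartUploadEtagModelInput takes partNumber and eTag; neither type is spelled out in this schema, so those field names are assumptions:
# Step 1: request presigned URLs for the parts of a multipart upload.
mutation {
  initFileUpload(initFileUploadInput: {
    name: "entries.zip"
    size: 10485760
  }) {
    uploadId
    key
    preSignedUrls {
      partNumber # assumed field name
      url # assumed field name
    }
  }
}
After uploading every part to its presigned URL, report the collected ETags:
# Step 2: mark the multipart upload as complete.
mutation {
  completeMultiPartUpload(completeMultipartUploadInput: {
    uploadId: "upload123"
    key: "imports/entries.zip"
    multiPartUploads: [
      { partNumber: 1, eTag: "etag-part-1" } # assumed input field names
    ]
  }) {
    success
  }
}
The key returned by initFileUpload is presumably the value to pass as fileKey when creating the import job.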
InitFileUploadInput
input InitFileUploadInput {
  name: String!
  size: Long!
}
InitFileUploadPayload
type InitFileUploadPayload {
  preSignedUrls: [PresignedUrlPart!]!
  uploadId: String!
  key: String!
}
CompleteMultipartUploadInput
input CompleteMultipartUploadInput {
  uploadId: String!
  key: String!
  multiPartUploads: [MultipartUploadEtagModelInput!]!
}
CompleteMultipartUploadPayload
type CompleteMultipartUploadPayload {
  success: Boolean!
}
CreateImportJobInput
input CreateImportJobInput {
  jobName: String!
  fileKey: String
  jobType: JobType!
  scheduledTime: DateTime
}
CreateImportJobPayload
type CreateImportJobPayload {
  id: String!
  name: String!
  type: JobType!
  fileKey: String
  status: JobStatus!
  tasks: [JobTask!]!
  createdAt: DateTime!
  updatedAt: DateTime
  createdBy: String!
  updatedBy: String!
}
CreateExportJobInput
input CreateExportJobInput {
  jobName: String!
  filter: String
  jobType: JobType!
}
CreateExportJobPayload
type CreateExportJobPayload {
  id: String!
  name: String!
  type: JobType!
  fileKey: String
  status: JobStatus!
  tasks: [JobTask!]!
  createdAt: DateTime!
  updatedAt: DateTime
  createdBy: String!
  updatedBy: String!
}
CreateDeleteJobInput
input CreateDeleteJobInput {
  jobName: String!
  scheduledTime: DateTime
  siteId: String
  model: String
  fieldAlias: String
  isDeleteFieldJob: Boolean
  deleteContentTaskTypes: [DeleteContentTaskTypes!]
}
CreateDeleteJobPayload
type CreateDeleteJobPayload {
  id: String!
  name: String!
  type: JobType!
  status: JobStatus!
  tasks: [JobTask!]!
  createdAt: DateTime!
  updatedAt: DateTime
  createdBy: String!
  updatedBy: String!
}
CreateEnvironmentJobInput
input CreateEnvironmentJobInput {
  jobName: String!
  baseEnvironmentId: String!
  newEnvironmentAlias: String!
  newEnvironmentDisplayName: String!
  scheduledTime: DateTime
}
CreateEnvironmentJobPayload
type CreateEnvironmentJobPayload {
  id: String!
  name: String!
  type: JobType!
  fileKey: String
  status: JobStatus!
  recurringJobId: String
  tasks: [JobTask!]!
  createdAt: DateTime!
  updatedAt: DateTime
  createdBy: String!
  updatedBy: String!
}
SyncEnvironmentsJobInput
input SyncEnvironmentsJobInput {
  jobName: String!
  baseEnvironmentId: String!
  targetEnvironmentId: String!
  cronExpression: String
}
SyncEnvironmentsJobPayload
type SyncEnvironmentsJobPayload {
  job: Job
  recurringJob: RecurringJob
}
DeleteEnvironmentJobInput
input DeleteEnvironmentJobInput {
  jobName: String!
  environmentId: String!
}
DeleteEnvironmentJobPayload
type DeleteEnvironmentJobPayload {
  id: String!
  name: String!
  type: JobType!
  fileKey: String
  status: JobStatus!
  tasks: [JobTask!]!
  createdAt: DateTime!
  updatedAt: DateTime
  createdBy: String!
  updatedBy: String!
}
RunRecurringJobInput
input RunRecurringJobInput {
  recurringJobId: String!
}
RunRecurringJobPayload
type RunRecurringJobPayload {
  id: String!
  name: String!
  cronExpression: String
  type: RecurringJobType!
  status: RecurringJobStatus!
  attrs: [KeyValuePairOfStringAndObject!]
  createdAt: DateTime!
  updatedAt: DateTime
  createdBy: String!
  updatedBy: String!
}
DeleteRecurringJobInput
input DeleteRecurringJobInput {
  recurringJobId: String!
}
DeleteRecurringJobPayload
type DeleteRecurringJobPayload {
  success: Boolean!
  recurringJobId: String!
}
ToggleRecurringJobStatusInput
input ToggleRecurringJobStatusInput {
  recurringJobId: String!
  pause: Boolean!
}
ToggleRecurringJobStatusPayload
type ToggleRecurringJobStatusPayload {
  success: Boolean!
}
CreateCloneSiteJobInput
input CreateCloneSiteJobInput {
  jobName: String!
  baseSiteId: String!
  newSiteAlias: String!
  newSiteName: String!
  newSiteLogo: String!
}
CreateCloneSiteJobPayload
type CreateCloneSiteJobPayload {
  id: String!
  name: String!
  type: JobType!
  status: JobStatus!
  tasks: [JobTask!]!
  createdAt: DateTime!
  updatedAt: DateTime
  createdBy: String!
  updatedBy: String
}
Example Queries
Below are some example queries and mutations to help you get started with the Codex Jobs GraphQL API.
Create an Import Job
mutation {
  createImportJob(createJobInput: {
    jobName: "Import Assets"
    fileKey: "assets/file.zip"
    jobType: IMPORT_ASSETS
    scheduledTime: "2024-05-14T10:00:00Z"
  }) {
    id
    name
    type
    fileKey
    status
    tasks {
      id
      consoleAppName
      displayName
      status
      type
      runAt
    }
    createdAt
    updatedAt
    createdBy
    updatedBy
  }
}
Create a Delete Job
mutation {
  createDeleteJob(createJobInput: {
    jobName: "Delete Content"
    scheduledTime: "2024-05-14T12:00:00Z"
    siteId: "site123"
    model: "Article"
    fieldAlias: "category"
    isDeleteFieldJob: false
    filter: "{\"system\":{\"SiteId\":{\"In\":[\"your_site_id\"],\"_t\":\"StringFilter\"},\"ModelId\":{\"In\":[\"your_model_id\"],\"_t\":\"StringFilter\"},\"_t\":\"EntrySystemFilter\"}}"
    deleteContentTaskTypes: [DELETE_ENTRIES]
  }) {
    id
    name
    type
    status
    tasks {
      id
      consoleAppName
      displayName
      status
      type
      runAt
    }
    createdAt
    updatedAt
    createdBy
    updatedBy
  }
}
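Create an Environment Job
A minimal sketch of creating a new environment from an existing base environment; the ID, alias, and display name below are placeholders:
mutation {
  createEnvironmentJob(createJobInput: {
    jobName: "Create Staging Environment"
    baseEnvironmentId: "env123"
    newEnvironmentAlias: "staging"
    newEnvironmentDisplayName: "Staging"
  }) {
    id
    name
    type
    status
    tasks {
      id
      displayName
      status
      type
    }
    createdAt
    createdBy
  }
}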
Run a Recurring Job
mutation {
  runRecurringJob(createJobInput: {
    recurringJobId: "recurring123"
  }) {
    id
    name
    cronExpression
    type
    status
    attrs {
      key
      value
    }
    createdAt
    updatedAt
    createdBy
    updatedBy
  }
}
Toggle Recurring Job Status
mutation {
  toggleRecurringJobStatus(toggleRecurringJobInput: {
    recurringJobId: "recurring123"
    pause: true
  }) {
    success
  }
}
Delete a Recurring Job
mutation {
  deleteRecurringJob(deleteRecurringJobInput: {
    recurringJobId: "recurring123"
  }) {
    success
    recurringJobId
  }
}
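Sync Two Environments
A minimal sketch of syncing a base environment into a target environment; the environment IDs are placeholders, and passing cronExpression presumably schedules the sync as a recurring job (the payload exposes both the one-off job and the recurring job):
mutation {
  syncEnvironmentJob(createJobInput: {
    jobName: "Sync Production into Staging"
    baseEnvironmentId: "env123"
    targetEnvironmentId: "env456"
    cronExpression: "0 2 * * *"
  }) {
    job {
      id
      status
    }
    recurringJob {
      id
      cronExpression
      status
    }
  }
}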
Create an Export Job
mutation {
  createExportJob(createJobInput: {
    jobName: "Export Entries"
    filter: "{\"system\":{\"ModelId\":{\"In\":[\"your_model_id\"],\"_t\":\"StringFilter\"},\"_t\":\"EntrySystemFilter\"}}"
    jobType: EXPORT_ENTRIES
  }) {
    id
    name
    type
    status
    tasks {
      id
      consoleAppName
      displayName
      status
      type
      runAt
    }
    createdAt
    updatedAt
    createdBy
    updatedBy
  }
}
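Clone a Site
A minimal sketch of cloning an existing site; the site ID, alias, name, and logo URL are placeholders:
mutation {
  createCloneSiteJob(createJobInput: {
    jobName: "Clone Main Site"
    baseSiteId: "site123"
    newSiteAlias: "main-copy"
    newSiteName: "Main Site (Copy)"
    newSiteLogo: "https://example.com/logo.png"
  }) {
    id
    name
    type
    status
    tasks {
      id
      displayName
      status
      type
    }
    createdAt
    createdBy
  }
}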