Bulk API v2 Get Job Status Failing - InvalidBatch : Field name not found
I've been beating my head against the wall of the Salesforce Bulk API v2.0 for a couple of weeks now and I'm at my wits' end.
I've been able to use Postman to test most of the calls, and they appear to work with correct values being returned. I can log in, create a job, upload a file, close the job, and check the status, all in Postman, and it works correctly.
I can duplicate this process in C# and it all works as expected until I try to check the status of the job. At that point it returns this error, along with the normal fields:
"InvalidBatch : InvalidBatch : Field name not found : ". Unlike other posts I've seen, no field name is listed.
I've double-checked the session token, the URL, the HTTP method, and the headers. Everything appears normal and matches what I'm using in Postman, with the obvious values replaced.
This is the URL I'm calling to get the job status: https://instance.my.salesforce.com/services/data/v45.0/jobs/ingest/jobID. It matches the documentation and all the samples I've found.
Here is the JSON returned from this call (the job ID, created-by, and object fields have been removed for this post):
{
"operation": "insert",
"createdDate": "2019-03-27T19:08:46.000+0000",
"systemModstamp": "2019-03-27T19:08:52.000+0000",
"state": "Failed",
"concurrencyMode": "Parallel",
"contentType": "CSV",
"apiVersion": 45.0,
"jobType": "V2Ingest",
"lineEnding": "CRLF",
"columnDelimiter": "COMMA",
"numberRecordsProcessed": 0,
"numberRecordsFailed": 0,
"retries": 0,
"totalProcessingTime": 0,
"apiActiveProcessingTime": 0,
"apexProcessingTime": 0,
"errorMessage": "InvalidBatch : InvalidBatch : Field name not found : --e52c6b94-9d37-4eda-b096-59d96b7e0cb5"
}
I've double- and triple-checked the column names I'm sending in the CSV file's header record; they all match. Besides, if they didn't, Postman would fail too.
Our resident internet guru is also at a loss as to why this isn't working. Here is the code I'm using to get the job status:
// Ask for JSON back from the job-status endpoint
client.DefaultRequestHeaders.Accept.Clear();
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
// ContentUrl ends in "/batches"; trimming that off gives the job info URL
HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Get, BulkEndPoint + ContentUrl.Substring(0, ContentUrl.IndexOf("/batches")));
Task<HttpResponseMessage> responseMessage = client.SendAsync(request);
Task<string> response = responseMessage.GetAwaiter().GetResult().Content.ReadAsStringAsync();
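For reference, a minimal sketch of pulling the state and errorMessage fields out of that response (this assumes Newtonsoft.Json is available; it's just for illustration and not part of my actual code):
// Hypothetical illustration: parse the job-info JSON shown above.
// Requires: using Newtonsoft.Json.Linq;
string json = response.GetAwaiter().GetResult();
JObject jobInfo = JObject.Parse(json);
string state = (string)jobInfo["state"];        // e.g. "Failed" or "JobComplete"
string error = (string)jobInfo["errorMessage"]; // null when the job hasn't failed
Console.WriteLine($"Job state: {state}, error: {error ?? "none"}");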
I've been through the Salesforce documents, tutorials, and many other C# examples, and I haven't been able to determine the issue.
I've searched the internet high and low and I can't find anything resembling this issue.
The only thing I can think of is that the Salesforce consultant wanted me to embed a CRLF in our address lines so the complete street address would be in one field. But I don't know why that would be OK when using Postman but not OK when running from code.
If anyone has any ideas, I'd really appreciate it.
Edit: I can confirm it is not the CRLF embedded in the street address. I've swapped between CR, LF, CRLF, and no character at all; it makes no difference.
Edit: To be perfectly clear, the job is flagged as failed as soon as I close it. To close the job I call PATCH with a body of { "state" : "UploadComplete" }.
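For completeness, here is roughly what that close call looks like; a minimal sketch using the same client, BulkEndPoint, and ContentUrl variables as the status code above (new HttpMethod("PATCH") because HttpMethod.Patch isn't available on older frameworks):
// Close the job by PATCHing the job resource (same URL as the status call)
// with a body of { "state" : "UploadComplete" }. Requires: using System.Text;
string jobUrl = BulkEndPoint + ContentUrl.Substring(0, ContentUrl.IndexOf("/batches"));
HttpRequestMessage patchRequest = new HttpRequestMessage(new HttpMethod("PATCH"), jobUrl);
patchRequest.Content = new StringContent("{ \"state\" : \"UploadComplete\" }", Encoding.UTF8, "application/json");
HttpResponseMessage patchResponse = client.SendAsync(patchRequest).GetAwaiter().GetResult();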
api c# bulk
asked Mar 28 at 16:21 by Steve D. (edited Mar 28 at 19:00)
Can you print and see what's the value of BulkEndPoint + ContentUrl.Substring(0, ContentUrl.IndexOf("/batches"))? – Pranay Jaiswal, Mar 28 at 17:31
@pranay-jaiswal It's the same as the link I posted in the question. I double-checked this just now: ag.my.salesforce.com/services/data/v45.0/jobs/ingest/…. – Steve D., Mar 28 at 18:09
1 Answer
Problem resolved! Here is the job info returned after the fix:
{
"operation": "insert",
"createdDate": "2019-03-28T19:40:38.000+0000",
"systemModstamp": "2019-03-28T19:41:34.000+0000",
"state": "JobComplete",
"concurrencyMode": "Parallel",
"contentType": "CSV",
"apiVersion": 45.0,
"jobType": "V2Ingest",
"lineEnding": "CRLF",
"columnDelimiter": "COMMA",
"numberRecordsProcessed": 883,
"numberRecordsFailed": 809,
"retries": 0,
"totalProcessingTime": 11210,
"apiActiveProcessingTime": 10847,
"apexProcessingTime": 0
}
Just in case anyone else runs into this same issue: I didn't find this information anywhere; I just kept modifying the code until I got it working. The issue for me was how the CSV file was being uploaded to Salesforce.
This was my original code:
MultipartFormDataContent content = new MultipartFormDataContent()
{
    Headers =
    {
        ContentType = new MediaTypeHeaderValue("text/csv")
    }
};
client.DefaultRequestHeaders.Accept.Clear();
using (FileStream fs = new FileStream(UploadFileType.FilePath, FileMode.Open, FileAccess.Read))
{
    // Wrap the CSV file in a multipart form-data part
    StreamContent streamContent = new StreamContent(fs);
    streamContent.Headers.Add("Content-Type", "application/octet-stream");
    streamContent.Headers.Add("Content-Disposition", "form-data; name=\"file\"; filename=\"" + Path.GetFileName(UploadFileType.FilePath) + "\"");
    content.Add(streamContent);
    // PUT the multipart content to the job's batches URL
    Task<HttpResponseMessage> responseMessage = client.PutAsync(BulkEndPoint + ContentUrl, content);
    var result = responseMessage.Result.Content.ReadAsStringAsync();
    string resultString = result.Result;
}
I'm guessing Salesforce didn't like the multipart headers on the content and streamContent objects.
Here is the new code. It still uploads the file, but the job no longer fails when it's closed, and as shown above, the job-info JSON no longer contains an error message. This contradicts the examples I was able to find for uploading files to Salesforce with the Bulk API v2.0:
// We don't want any weird headers hanging around
client.DefaultRequestHeaders.Accept.Clear();
client.DefaultRequestHeaders.Add("Accept", "application/json");
using (FileStream fs = new FileStream(UploadFileType.FilePath, FileMode.Open, FileAccess.Read))
{
    // Send the raw CSV as the request body with nothing but a text/csv content type
    StreamContent streamContent = new StreamContent(fs);
    streamContent.Headers.Add("Content-Type", "text/csv");
    //streamContent.Headers.Add("Content-Disposition", "form-data; name=\"file\"; filename=\"" + Path.GetFileName(UploadFileType.FilePath) + "\"");
    Task<HttpResponseMessage> responseMessage = client.PutAsync(BulkEndPoint + ContentUrl, streamContent);
    var result = responseMessage.Result.Content.ReadAsStringAsync();
    string resultString = result.Result;
}
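In hindsight this also explains the original error message: the "field name" Salesforce couldn't find (--e52c6b94-9d37-4eda-b096-59d96b7e0cb5) looks like a multipart boundary string, which suggests the v2 upload endpoint expects the raw CSV as the request body; the multipart framing was ending up inside the data and being read as a column header.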
answered Mar 28 at 19:53 by Steve D.