Bulk API v2 Get Job Status Failing - InvalidBatch : Field name not found



I've been beating my head against the wall with the Salesforce Bulk API v2.0 for a couple of weeks now, and I'm at my wits' end.



I've been able to use Postman to test most of the calls, and they appear to work, with the correct values returned. I can log in, create a job, upload a file, close the job, and check the status, all in Postman, and it works correctly.
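
For reference, here is a minimal sketch of the create-job call, the one step not shown elsewhere in this post. It assumes the standard Bulk API 2.0 job-creation request, reuses the same client and BulkEndPoint as the snippets below, and uses a placeholder object name because the real one was removed from this post.

// Sketch only: POST the job definition to the ingest jobs resource.
// Assumes BulkEndPoint is the instance base URL and client already carries the
// Authorization: Bearer <session token> header; requires System.Text for Encoding.
StringContent createJobBody = new StringContent(
    "{ \"object\": \"Account\", \"contentType\": \"CSV\", \"operation\": \"insert\", \"lineEnding\": \"CRLF\" }",
    Encoding.UTF8, "application/json");
HttpResponseMessage createResponse = client.PostAsync(BulkEndPoint + "services/data/v45.0/jobs/ingest", createJobBody).GetAwaiter().GetResult();
// The response JSON includes "id" and "contentUrl"; contentUrl ends in "/batches"
// and is the path the CSV gets PUT to in the next step. "Account" above is a
// placeholder object, and the slash handling depends on how BulkEndPoint is defined.
string createJson = createResponse.Content.ReadAsStringAsync().GetAwaiter().GetResult();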



I can duplicate these calls in C#, and it all works as expected until I try to check the status of the job. At that point it returns this error, along with the normal fields:

"InvalidBatch : InvalidBatch : Field name not found : ". There is no field name listed, unlike in the other posts I have seen.



I've double-checked the session token, the URL, the HTTP method, and the headers. Everything appears to be normal and matches what I'm using in Postman, with the obvious values replaced.



This is the URL I'm calling to return the job status: https://instance.my.salesforce.com/services/data/v45.0/jobs/ingest/jobID. It matches the documentation and all the samples I've found.



Here is the JSON returned from this call (the job id, created by, and object fields have been removed for this post):

{
    "operation": "insert",
    "createdDate": "2019-03-27T19:08:46.000+0000",
    "systemModstamp": "2019-03-27T19:08:52.000+0000",
    "state": "Failed",
    "concurrencyMode": "Parallel",
    "contentType": "CSV",
    "apiVersion": 45.0,
    "jobType": "V2Ingest",
    "lineEnding": "CRLF",
    "columnDelimiter": "COMMA",
    "numberRecordsProcessed": 0,
    "numberRecordsFailed": 0,
    "retries": 0,
    "totalProcessingTime": 0,
    "apiActiveProcessingTime": 0,
    "apexProcessingTime": 0,
    "errorMessage": "InvalidBatch : InvalidBatch : Field name not found : --e52c6b94-9d37-4eda-b096-59d96b7e0cb5"
}



I've double- and triple-checked the column names I'm sending in the CSV file's header record; they all match. Plus, if they didn't, Postman would also fail.



Our resident internet guru is also at a loss as to why this isn't working. Here is the code I'm using to return the job status.



// Ask for JSON back from the job info endpoint.
client.DefaultRequestHeaders.Accept.Clear();
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

// ContentUrl ends with ".../batches"; trimming that off gives the job info URL.
HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Get, BulkEndPoint + ContentUrl.Substring(0, ContentUrl.IndexOf("/batches")));
Task<HttpResponseMessage> responseMessage = client.SendAsync(request);
Task<string> response = responseMessage.GetAwaiter().GetResult().Content.ReadAsStringAsync();


I've been through the Salesforce documents, tutorials, and many other C# examples. I haven't been able to determine the issue.



I've searched the internet high and low and I can't find anything resembling this issue.



The only thing I can think of is that the Salesforce consultant wanted me to embed a CRLF in our address lines so the complete street address would be in one field. But I don't know why that would be OK when using Postman but not when running from code.



If anyone has any ideas, I'd really appreciate it.



Edit: I can confirm it is not the CRLF embedded in the street address. I've swapped between CR, LF, CRLF, and no character at all; it makes no difference.



Edit: To be perfectly clear, the job is being flagged as Failed as soon as I close it. To do that, I call PATCH with a body of { "state" : "UploadComplete" }.
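
For completeness, the close-job call is roughly the sketch below (not the exact code), reusing the same client, BulkEndPoint, and ContentUrl as the status snippet above. SendAsync with a PATCH HttpMethod avoids needing PatchAsync, which older HttpClient versions don't have.

// Sketch only: close the job by PATCHing its state to UploadComplete.
// The job URL is the same one used for the status GET; requires System.Text for Encoding.
string jobUrl = BulkEndPoint + ContentUrl.Substring(0, ContentUrl.IndexOf("/batches"));
HttpRequestMessage patchRequest = new HttpRequestMessage(new HttpMethod("PATCH"), jobUrl)
{
    Content = new StringContent("{ \"state\" : \"UploadComplete\" }", Encoding.UTF8, "application/json")
};
HttpResponseMessage patchResponse = client.SendAsync(patchRequest).GetAwaiter().GetResult();
// A success response only means the state change was accepted; Salesforce then
// validates and processes the uploaded CSV asynchronously, which is when this job
// flips to Failed.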










Tags: api, c#, bulk

  • Can you print and see what's the value of BulkEndPoint + ContentUrl.Substring(0, ContentUrl.IndexOf("/batches"))?

    – Pranay Jaiswal
    3 hours ago












  • @pranay-jaiswal, it's the same as the link I posted in the question. I double checked this just now. ag.my.salesforce.com/services/data/v45.0/jobs/ingest/….

    – Steve D.
    2 hours ago

1 Answer

Problem resolved!

{
    "operation": "insert",
    "createdDate": "2019-03-28T19:40:38.000+0000",
    "systemModstamp": "2019-03-28T19:41:34.000+0000",
    "state": "JobComplete",
    "concurrencyMode": "Parallel",
    "contentType": "CSV",
    "apiVersion": 45.0,
    "jobType": "V2Ingest",
    "lineEnding": "CRLF",
    "columnDelimiter": "COMMA",
    "numberRecordsProcessed": 883,
    "numberRecordsFailed": 809,
    "retries": 0,
    "totalProcessingTime": 11210,
    "apiActiveProcessingTime": 10847,
    "apexProcessingTime": 0
}



Just in case anyone else runs into this same issue: I didn't find this information anywhere; I just kept modifying the code until I got it working. The issue for me was how the CSV file was being uploaded to Salesforce.



This was my original code.



MultipartFormDataContent content = new MultipartFormDataContent()
{
    Headers =
    {
        ContentType = new MediaTypeHeaderValue("text/csv")
    }
};

client.DefaultRequestHeaders.Accept.Clear();

using (FileStream fs = new FileStream(UploadFileType.FilePath, FileMode.Open, FileAccess.Read))
{
    StreamContent streamContent = new StreamContent(fs);
    streamContent.Headers.Add("Content-Type", "application/octet-stream");
    streamContent.Headers.Add("Content-Disposition", "form-data; name=\"file\"; filename=\"" + Path.GetFileName(UploadFileType.FilePath) + "\"");
    content.Add(streamContent);
    Task<HttpResponseMessage> responseMessage = client.PutAsync(BulkEndPoint + ContentUrl, content);
    var result = responseMessage.Result.Content.ReadAsStringAsync();
    string resultString = result.Result;
}



I'm guessing it didn't like the headers set on the content and streamContent variables.



Here is the new code. It still uploads the file but no longer fails when the job is closed, and, as shown above, the JSON returned from the job info call no longer has an error message. This contradicts the examples I was able to find about uploading files to Salesforce using the Bulk API v2.0.



// We don't want any weird headers hanging around
client.DefaultRequestHeaders.Accept.Clear();
client.DefaultRequestHeaders.Add("Accept", "application/json");

using (FileStream fs = new FileStream(UploadFileType.FilePath, FileMode.Open, FileAccess.Read))
{
    StreamContent streamContent = new StreamContent(fs);
    streamContent.Headers.Add("Content-Type", "text/csv");
    //streamContent.Headers.Add("Content-Disposition", "form-data; name=\"file\"; filename=\"" + Path.GetFileName(UploadFileType.FilePath) + "\"");
    Task<HttpResponseMessage> responseMessage = client.PutAsync(BulkEndPoint + ContentUrl, streamContent);
    var result = responseMessage.Result.Content.ReadAsStringAsync();
    string resultString = result.Result;
}
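
Side note: the job info above still reports 809 failed records out of 883. Those per-record errors can be pulled from the job's failed-results resource. The following is only a rough sketch, assuming the standard Bulk API 2.0 failedResults endpoint and the same jobUrl / client pattern as the snippets above; it is not part of the code shown in this answer.

// Sketch only: fetch the per-record errors for the failed rows.
// jobUrl is the same job info URL used for the status check, i.e.
// BulkEndPoint + ContentUrl.Substring(0, ContentUrl.IndexOf("/batches")).
HttpRequestMessage failedRequest = new HttpRequestMessage(HttpMethod.Get, jobUrl + "/failedResults/");
failedRequest.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue("text/csv"));
HttpResponseMessage failedResponse = client.SendAsync(failedRequest).GetAwaiter().GetResult();
// The body comes back as CSV; Salesforce prepends sf__Id and sf__Error columns
// to the originally submitted fields.
string failedCsv = failedResponse.Content.ReadAsStringAsync().GetAwaiter().GetResult();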





