I'm working on a "deep clone" Visualforce page where, starting from an object, that object, its child objects, its grandchild objects and so on are cloned, and that logic works fine.

But any Attachments related to these objects also have to be cloned. Given that the available heap is 6M in synchronous controller code and the body of an Attachment can be up to 5M in size, this looks awkward to handle using conventional bulk queries and DML.

I can think of a few approaches:

- First query all the `Attachment.BodyLength` values and, if the total is less than say 2M, go ahead and clone directly. (The 2M of bodies cloned to set a new parent ID and clear the record ID means about 4M of heap used.)
- Count the number and size of the Attachments and, if there are fewer than say 50 and all are 2M or less, query and insert them one at a time.
- Use a Batchable, either only when 1 or 2 are not possible or every time (though the "Maximum number of Batch Apex jobs queued or active" limit of 5 is a concern there if several deep clones are done in a row).

Of those, 1 with 3 as the fallback seems a reasonable approach.
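The up-front size check in approach 1 can be done without pulling any bodies into the heap by querying only `BodyLength`. A rough sketch, assuming `parentIds` is the old-to-new parent ID map built by the existing clone logic:

```
// Only BodyLength is queried, so heap use stays tiny regardless of body sizes
Integer total = 0;
for (Attachment a : [
        select BodyLength
        from Attachment
        where ParentId in :parentIds.keySet()
]) {
    total += a.BodyLength;
}
if (total < 2 * 1024 * 1024) {
    // Under ~2M of bodies: safe to clone directly in this transaction
} else {
    // Too big: fall back to a Batchable (approach 3)
}
```

The 2M threshold is the illustrative figure from the question, not a hard platform limit; it leaves headroom for the cloned copies within the 6M synchronous heap.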

But is there a better solution to this problem?

Attribution to: Keith C

# Possible Suggestion/Solution #1

Here is some code that uses approach 2 from the original question. Thanks to the comments it avoids cloning the body and so uses heap space only marginally larger than the Attachment body size.

I guess this would be a good choice if there are a small number of large attachments. (It would need to be preceded by a count of the total number of Attachments and refuse to work if there were more than say 50 to avoid the 100 query limit and the 150 update limit.)
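That preliminary guard could look something like the following sketch (the cap of 50 is the arbitrary figure from the text, and `parentIds` is again assumed to be the old-to-new parent ID map):

```
// One-at-a-time cloning costs one query plus one insert per Attachment,
// so cap the count well below the 100 SOQL / 150 DML governor limits
Integer attachmentCount = [
    select count()
    from Attachment
    where ParentId in :parentIds.keySet()
];
if (attachmentCount > 50) {
    // Too many to clone synchronously one at a time; use a Batchable instead
    return;
}
```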

```
Set<Id> doneIds = new Set<Id>();
Attachment[] sobs;
do {
    // Query one Attachment at a time so only one body is ever in the heap
    sobs = [
            select Id, ParentId, Name, Description, Body, ContentType
            from Attachment
            where ParentId in :parentIds.keySet()
            and Id not in :doneIds
            limit 1
            ];
    if (sobs.size() == 1) {
        Attachment sob = sobs[0];
        // Build the copy inline rather than cloning, so the body is not duplicated
        insert new Attachment(
                ParentId = parentIds.get(sob.ParentId),
                Name = sob.Name,
                Description = sob.Description,
                Body = sob.Body,
                ContentType = sob.ContentType
                );
        doneIds.add(sob.Id);
        System.debug('>>> loop heap=' + Limits.getHeapSize());
    }
} while (sobs.size() == 1);
```

I may use this (approach 2) combined with 3, or 1 combined with 3. Not decided yet.

Other thoughts very welcome.

Attribution to: Keith C

# Possible Suggestion/Solution #2

Here is the solution that I was discussing in the comments. It's very similar in concept to yours, except it tries to reduce the number of DML and SOQL statements by grouping the Attachments into batches of up to 5MB (the limit will need adjusting; I just chose it to prove the concept).

The code could do with cleaning up a bit, and you'll want to change how the ParentIds are selected/assigned, but hopefully the concept is clear:

```
// First pass: group Attachment IDs into batches whose total BodyLength
// stays under 5MB, querying only BodyLength so heap use stays tiny
List<List<Id>> batches = new List<List<Id>>();
List<Integer> batchSizes = new List<Integer>();
for (Attachment attachment : [SELECT Id, BodyLength FROM Attachment WHERE ParentId = '5002000000ahX5f']) {
    Boolean batched = false;
    // First-fit: add to the first batch that still has room
    for (Integer i = 0; i < batches.size(); i++) {
        Integer batchSize = batchSizes[i];
        if (batchSize + attachment.BodyLength < 5000000) {
            batches[i].add(attachment.Id);
            batchSizes[i] += attachment.BodyLength;
            batched = true;
            break;
        }
    }
    if (!batched) {
        // No batch has room; start a new one for this Attachment
        batches.add(new List<Id>{attachment.Id});
        batchSizes.add(attachment.BodyLength);
    }
    System.debug('>>>first loop heap=' + Limits.getHeapSize());
}
// Second pass: query the bodies and insert the clones one batch at a time
for (List<Id> batchIds : batches) {
    List<Attachment> attachmentsToInsert = new List<Attachment>();
    for (Attachment attachment : [SELECT Name, Body FROM Attachment WHERE Id IN :batchIds]) {
        attachmentsToInsert.add(new Attachment(Name = attachment.Name, Body = attachment.Body, ParentId = '5002000000ahX5v'));
    }
    System.debug('>>> second loop heap=' + Limits.getHeapSize());
    insert attachmentsToInsert;
}
```

The output for an object with 5 Attachments (4.28MB, 1.23MB, 1.24MB, 3.59MB, 4.29MB) is as follows:

```
>>>first loop heap=1871
>>>first loop heap=1917
>>>first loop heap=1955
>>>first loop heap=1993
>>>first loop heap=2015
>>> second loop heap=4503453
>>> second loop heap=2589001
>>> second loop heap=4489577
>>> second loop heap=3759972
Number of SOQL queries: 5 out of 100
Number of DML statements: 4 out of 150
```

In this case it has only saved one DML statement, but with a lot of small Attachments, or with a larger batch limit, it will save a lot more.

For example, using a limit of 10MB gives you the following:

```
>>>first loop heap=1871
>>>first loop heap=1901
>>>first loop heap=1939
>>>first loop heap=1961
>>>first loop heap=1983
>>> second loop heap=9549616
>>> second loop heap=5788869
Number of SOQL queries: 3 out of 100
Number of DML statements: 2 out of 150
```

Attribution to: Alex Tennant

This content is remixed from stackoverflow or stackexchange. Please visit https://salesforce.stackexchange.com/questions/34443