Azure App Service - Could not find file 'C:\home\site\wwwroot\doconut-logs'

The viewer works locally and when hosted in our QA app service, but when trying to load it in production, we’re getting the error “Could not find file 'C:\home\site\wwwroot\doconut-logs'” when loading DocImage.axd.

I had a similar issue with DoconutPngExportPath - I had to set the path to C:\home\site\doconut-export because it was trying to use C:\home\site\wwwroot\doconut-export and failing, even though we’re actually using blob storage to store the generated images. We call:

viewer.ExportToCloud( cloudConfig );

and our web.config is configured to use:

<add name="DocImage" verb="GET,POST" path="DocImage.axd" type="Doconut.Clouds.AzureStorageImageHandler, Doconut.Clouds" />

I think it’s checking if the path exists and throwing an error, even if it’s not actually saving anything (correct me if I’m wrong).

Is there a way to store whatever these logs are as blobs too? Or is there a way to just turn them off, or at least change the path?

@AlanWaiss,

I attached a working project example with Azure storage config. Please check the assembly references in your project and compare it with the web.config file sample.

GenericCloudSample.zip (725.9 KB)

At the moment, the doconut-logs folder path is not customizable. The development team will add an option to use a custom path for Doconut logs in the next version. In the meantime, please verify that the web server has permission to write to the doconut-logs folder.

In the web.config, you can set the DoconutLogErrors key to deactivate logging:

<add key="DoconutLogErrors" value="false" />

OK, I see the new release added “A custom error log path can now be configured in Doconut Clouds improving log management and system monitoring.” - how do I do this?

@AlanWaiss,

To configure a custom error log path, you need to add the following entry to your web.config file:

<add key="DoconutLogErrorPath" value="CUSTOM-PATH" />

If the log path is located in the same directory as the web.config file, you can use a relative path like this:

<add key="DoconutLogErrorPath" value="~/custom-path" />

However, if the log path is outside the directory containing the web.config, you must specify the full absolute path to ensure the application can correctly locate and write to the log file.

If no custom path is specified, the default log folder remains as ~/doconut-logs. Please note that the specified folder must already exist, and the IIS application pool must have the appropriate permissions to write to this directory.

If you have any further questions or need additional assistance, feel free to reach out.

We have set the DoconutLogErrors key to “false”, but some of our users are still sometimes receiving errors such as

Could not find file ‘C:\home\site\wwwroot\doconut-logs\Azure-Storage-2bbb61e1-066f-4da0-aac9-6ae63baf15eb.txt’.

It does not happen every time, though. My best theory is that the initial request that generates the images lands on one instance of the app service, but the subsequent request for DocImage.axd goes to a different instance, where the file doesn’t exist.

Is the handler looking for a log file even when logging is turned off?

@AlanWaiss,

Thank you for reporting this.

Although logging is set to “false”, the error log is still being generated, so I would like to gather more details to better understand the situation. Some questions I have are:

  • Could you confirm whether your application is running on a webfarm?
  • Could you attach your web.config file, with sensitive information removed?

If you have more specific steps or logs of times this issue occurs, sharing them might also help reproduce and diagnose the behavior. Thank you for your help.

Webfarm - yes. We’re running in an Azure App Service.

The doconut settings in web.config:

<configuration>
  <appSettings>
    <add key="DoconutAzureStorageConnectionString" value="Secret" />
    <add key="DoconutAzureContainerName" value="doconut" />
    <add key="DoconutLogErrors" value="false" />
    <add key="DoconutPngExportPath" value="C:\home\site\doconut-export" />
    <add key="DoconutPageWaitTimeSeconds" value="10" />
    <add key="DoconutStartWaitFromPage" value="5" />
  </appSettings>
  <system.webServer>
    <handlers>
      <add name="DocImage" verb="GET,POST" path="DocImage.axd" type="Doconut.Clouds.AzureStorageImageHandler, Doconut.Clouds" />
    </handlers>
  </system.webServer>
</configuration>

Doconut Log Issue.zip (5.0 KB)

@AlanWaiss,

Thank you for providing the requested information. Based on the details you have shared, I will attempt to replicate the issue using the same configuration.

If you come across any additional information that could help us fix this issue, please feel free to share it here.

I will keep you updated about the progress. Thank you for your cooperation.

I actually think you’ll have trouble replicating it - we have trouble replicating it ourselves. Is my theory correct that you’re saving a file to the local file system and then looking for that file in the DocImage.axd handler?

If so, is there a way to not do that or to save that data somewhere else, like blob storage?

@AlanWaiss,

Thank you for the additional information. Indeed, Doconut log files are saved locally on the server. By default, they are stored in a folder called ‘doconut-logs’ in the application root, although it is also possible to configure a custom path to suit your needs.

Regarding your query about the possibility of saving data in external storage such as blob storage, Doconut currently does not have a native option to redirect these logs to cloud storage services. The current design focuses on local file management to optimize performance and accessibility from the server itself.

Our development team continues to investigate this issue.

I’ve been experimenting with mounting a blob storage account and using the path for doconut logs. Initial tests seem to be working OK. It seems to be writing a log file for each preview.

What I’m seeing in each file is

GetConfigFromCacheDisk: Config path not found C:\home\site\QA\doconut-export\Doconut-Cloud-Config.json....Token : ccbe7ed5-6f8e-4806-82bd-f066a13c069c....Stack........

Why is it looking for a Doconut-Cloud-Config.json?

I know that file storage has different costs than pure blobs and there’s additional configuration involved, so I would still like a way to configure blob storage directly without needing to mimic a file path.

@AlanWaiss,

Thank you for your insights into using Blob Storage for Doconut logs. I’ve been investigating and testing the issue you’re encountering in detail. Here are the configuration details to adjust in order to resolve it.

The Viewer class has a method called SaveCloudConfig(CloudUploadConfig uploadConfig). This handles two main options:

  1. Saving to cache: SaveConfigToCache – This option stores the configuration in the server’s memory cache.
  2. Saving to disk: SaveConfigToDisk – This option writes the configuration to a physical file called Doconut-Cloud-Config.json on the server’s path.

In this case, since you are using blob storage and don’t require a local file on disk, I suggest setting SaveConfigToCache = true and SaveConfigToDisk = false. This way, the exception related to looking for the Doconut-Cloud-Config.json file will no longer occur, and the configuration will always stay in the cache.

Additionally, it’s important to set a sufficient cache expiration time using the uploadConfig.ConfigCacheTimeMinutes property to ensure that the configuration remains available for your application as needed.
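For example, a short sketch of the suggested settings, using the property names described above (the cache duration is just a placeholder value to adapt to your application):

```csharp
// Keep the cloud config in the server's memory cache only; no
// Doconut-Cloud-Config.json file is written to disk.
cloudConfig.SaveConfigToCache = true;
cloudConfig.SaveConfigToDisk = false;
cloudConfig.ConfigCacheTimeMinutes = 360; // placeholder: choose a value that covers your sessions

viewer.SaveCloudConfig(cloudConfig);
```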

It would be very helpful if you could send us a demo or example of what you’ve been testing with the blob storage setup, so I can review it in detail.

Please let me know if you have any further questions, and I look forward to reviewing your demo.

I already have SaveConfigToCache set to true and SaveConfigToDisk set to false.

cloudConfig.LogErrors = true;
cloudConfig.WebfarmPath = exportPath;
cloudConfig.PerDocumentFunctions = false;
cloudConfig.SaveConfigToCache = true;
cloudConfig.SaveConfigToDisk = false;
cloudConfig.ConfigCacheTimeMinutes = 360;

Actually, that might be the source of the problem. Wouldn’t the application cache be local to each service instance? If subsequent requests go to a different instance, the config wouldn’t be in that instance’s cache.

@AlanWaiss,

Your theory is correct. In a multi-instance service environment, each instance has its own local cache. Therefore, if one request is handled by one instance and a subsequent request by a different instance, the configuration stored in the local cache of the first instance will not be available in the second, leading to an error when the configuration cannot be found.

To resolve this problem, I recommend the following solution:

  • Use the SaveCloudConfig method, passing the CloudUploadConfig object with all the necessary configuration.
  • The SaveConfigToDisk property should be set to true on all instances.
  • When this property is true, the method generates the path for the Doconut-Cloud-Config.json file within the specified directory (WebfarmPath property).

This approach ensures that the configuration persists on disk and is available to all instances, provided that WebfarmPath points to a location shared by every instance (such as the mounted storage you described earlier).
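A minimal sketch of this webfarm-safe setup, combining the properties already shown in this thread. It assumes `exportPath` is a directory visible to every App Service instance (for example, the mounted storage path discussed earlier), and that an object-initializer construction of CloudUploadConfig is available — adjust to however you currently build the config:

```csharp
// Sketch only: webfarm-safe Doconut cloud configuration.
// Assumes exportPath is shared storage reachable by ALL instances,
// not instance-local disk.
var cloudConfig = new CloudUploadConfig
{
    WebfarmPath = exportPath,         // shared path where Doconut-Cloud-Config.json is written
    SaveConfigToDisk = true,          // persist the config so any instance can read it
    SaveConfigToCache = true,         // local cache can stay enabled as a fast path
    ConfigCacheTimeMinutes = 360
};

viewer.SaveCloudConfig(cloudConfig);  // writes Doconut-Cloud-Config.json under WebfarmPath
viewer.ExportToCloud(cloudConfig);
```

With the file on shared storage, an instance that misses its local cache can fall back to reading Doconut-Cloud-Config.json from WebfarmPath instead of failing.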