Forum tip: Always check when replies were posted. Technology evolves quickly, so some answers may not be up-to-date anymore.

Comments

  • Retention Policy Question
    We do not use the “keep # of versions” option, as that complicates our retention scheme.
    If we set the plan to keep 30 versions, a file that changes every day will have 30 versions in 30 days. For files that change once per month, it will keep 30 months of versions.
    We set our retention for file backups to 90 days, with one “full” each month and daily block-level backups.
    In the above example we would have 90 versions of the daily-updated file, and 3 versions of the monthly-updated file.
    Because we do “full” (what MSP360 calls incremental) backups only once per month, it takes an extra 29 days for the oldest set of full/block-level versions to age to 90 days. If we did a full (incremental) each week, we would only have an extra week to wait until the oldest set is purged. The trade-off is that we would be storing 12 “full” versions vs. 3, and if there are large, frequently updated files (.pst files, QuickBooks files, etc.) it can increase your storage consumption even more than keeping an extra 21 days of block-level backups.
    Confusing? Yes.
    But it has worked well for us.
    Happy to discuss further.
    -Cloud Steve
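    The retention arithmetic in Cloud Steve's comment can be sketched as follows. This is a minimal illustration of day-based retention, not MSP360 code; the helper names are my own:

```python
# Sketch of day-based retention math (illustration only, not MSP360 code).
# With day-based retention, the number of versions kept depends on how often
# a file changes, not on a fixed version count.

def versions_kept(retention_days: int, change_interval_days: int) -> int:
    """Versions retained for a file that changes every `change_interval_days`."""
    return retention_days // change_interval_days

def purge_delay_days(full_interval_days: int) -> int:
    """Extra days before the oldest full/block-level set ages past retention.

    A set can only be purged once its newest member exceeds retention,
    so the wait grows with the interval between fulls.
    """
    return full_interval_days - 1

# 90-day retention: a daily-changing file keeps 90 versions,
# a monthly-changing file keeps 3.
assert versions_kept(90, 1) == 90
assert versions_kept(90, 30) == 3

# Monthly fulls add ~29 extra days before purge; weekly fulls add ~6.
assert purge_delay_days(30) == 29
assert purge_delay_days(7) == 6
```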
  • Backup Plan Email Notifications - Error message causing agita
    To answer your first question: yes, please eliminate that information altogether. Include error messages for failed/warning backups, but not skipped files.
    My clients only want to know that the backup happened successfully. They don’t care about anything else. Frankly, all the rest of the info provided does nothing but elicit questions. And the “download here” link shows what backed up successfully, but doesn’t show what failed.
    Perhaps a discussion about the best options for client email notification is in order.
  • Backblaze Synthetic Full
    Just completed a couple of tests of the V7 Synthetic Full feature going to Backblaze.
    VHDx file
    Original Full (76GB) took 9 hours 5 mins (9:05) over a 1MB/s uplink.
    Subsequent Synthetic Full took only 1:38 with 4GB of changed data uploaded.
    82% reduction in runtime

    Image
    Original Full (72GB) = 8 hrs 26 mins
    Synthetic Full - 8GB uploaded - Remainder copied in cloud = 2 Hrs 15 mins
    73% reduction in runtime

    The more data that has changed since the last backup, and thus has to be uploaded, the longer the synthetic full will take, but regardless, this will dramatically reduce the backup time.
    Hopefully we can now get all of our full backups completed over the weekend, rather than running for 3-4 days in some cases.
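    The reduction figures in the test results above are straightforward percentage math; a quick sketch (the function name is mine, not anything from MSP360):

```python
# Runtime-reduction percentage from the synthetic full tests above.
# Simple arithmetic check, not MSP360 output.

def reduction_pct(original_minutes: int, synthetic_minutes: int) -> int:
    """Percent runtime saved by the synthetic full vs. the original full."""
    return round(100 * (1 - synthetic_minutes / original_minutes))

# VHDX: 9h05m original full vs. 1h38m synthetic full
assert reduction_pct(9 * 60 + 5, 1 * 60 + 38) == 82

# Image: 8h26m original full vs. 2h15m synthetic full
assert reduction_pct(8 * 60 + 26, 2 * 60 + 15) == 73
```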
  • Backblaze Synthetic Full
    Is there a plan to allow Synthetic fulls for Local backup accounts?
  • File and Folder - Confused About Chained Backups
    Well, the short answer is, we don’t need the feature.
    We run three backups of client data each night- local, Cloud 1, and Cloud 2.
    They can all run at the same time, and sometimes do, but we typically spread the start times out.
    Unless there is a specific dependency that file x has to be backed up prior to file y, I see no reason to bother with chained backups.
    It may be that I am missing something, so I am open to suggestions as to why I should be using it.
  • File and Folder - Confused About Chained Backups
    “I only want the second backup to run at the completion of the first backup and never independently on its own”
    We too have separate backups/retention periods for .pst files, but do not use chained backups at all. The plans are scheduled to run at different times of the night, but even if they overlap with one another it does not cause any problems, as they are backing up different files/folders.
  • New Version - Backup Agent 7.2 for Windows
    An image backup will have to restart any uncompleted partitions from the beginning.
  • New Version - Backup Agent 7.2 for Windows
    That works fine, thanks. Per my other post, we will not be reuploading 50TB of data just to get to the new format. We will use it for VHDx files and Images, where the synthetic full adds a lot of value and we only keep one version so switching formats is a no brainer.
    But please impress upon upper management that you can never deprecate the legacy format without negatively impacting all of your MBS customers.
  • New Version - Backup Agent 7.2 for Windows
    Surprised that you would release an MBS version that does not allow editing of backup plans that are in the new format using the admin portal. We will not roll out the new format until we can manage the plans from the portal.
  • New Version - Backup Agent 7.2 for Windows
    Thanks,
    I participated in the beta and was given the indication that at some point the only supported format would be the new one. That would be a problem. That being said, for image and Hyper-V/VMware backups, the new version is fantastic.
    Concerned that the banner is telling me that all of my clients are running an unsupported version.
  • New Version - Backup Agent 7.2 for Windows
    Does this require a re-upload of all data to conform to the new format? If that is the case, we will need to keep the old format indefinitely, as re-uploading everything is not viable.
  • Why Do I Have 4 Different Versions of an Outlook .PST in Amazon S3?
    For our clients in the Accounting/Tax fields we sell an "Extended Retention" option which keeps versions and deleted files in the backup location for 15 months (vs our standard of 90 days retention). We price this option under $10/month, but it has become very popular as it addresses the situation you described, and generates more revenue.
  • Backup History Search
    I apologize, I failed to specify the issue is with Backup History on the MSP portal, not the client console.
  • Backup History Search
    Individual endpoint. The search box does not appear to support wildcards, unless the "*" is not the right character for wildcard search :)
  • Moving from S3 to Wasabi
    There are ways to do migrations of S3 backups to Wasabi or Backblaze, but they are complicated and expensive. For a 3TB client, we took a USB hard drive MSP360 backup of the client data and brought it to our location, where we have 100 Mbps upstream bandwidth. We then uploaded the backup to Backblaze (took 7 days), but we were able to keep running the S3 backups until we were done with the upload. We then connected the client's server to the Backblaze bucket, did a repo sync (that took over a day itself), and ran the backups. Since we had 90 days of version retention in the Amazon bucket, we did not delete the S3 bucket until 90 days after the cutover.
    The extra cost was maintaining the two buckets for 3 months plus the tech time to oversee the migration, but very low compared to a Snow/Fireball method.
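    A back-of-the-envelope check on the 7-day upload mentioned above (the 40% effective-throughput factor is my assumption, covering protocol overhead, throttling, and other traffic on the link, not a measured value):

```python
# Rough seeding-time estimate for a cloud-to-cloud migration like the one
# described above. The efficiency factor is an assumption, not a measurement.

def upload_days(data_tb: float, link_mbps: float, efficiency: float = 0.4) -> float:
    """Days to upload `data_tb` terabytes over a `link_mbps` link.

    `efficiency` discounts the nominal link rate for protocol overhead,
    provider throttling, and other traffic sharing the connection.
    """
    bits = data_tb * 1e12 * 8
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400

# ~3 TB over a 100 Mbps uplink at ~40% effective throughput works out to
# about a week, consistent with the 7 days reported above.
days = upload_days(3, 100)
assert 6 < days < 8
```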
  • Windows 10 Pro Desktop marked as a server.
    I find that every install defaults to the Server version. I always go back and change it to the appropriate license as part of our installation process.
  • File List With Sizes
    From the agent console storage tab, I select the “Capacity” view which sorts by folder size. Typically what chews up a lot of space are pst files, and if you keep a lot of fulls, the storage can get chewed up fast.
  • Best solution for worst case
    We don't disable the agent console on the client devices, as there are times that we need to utilize it.
    Here is what we do:
    1. Protect the agent/CLI with a password (as David suggested)
    2. Disable the ability to delete backups from storage from the console (it is now disabled by default in the latest version). This necessitates using Cloudberry Explorer or the BackBlaze web portal to delete unwanted backups, but it is significantly better from a security standpoint.
    3. Disable the ability to change backup/restore plans (which protects retention policies) using the console. There are rare times when we need to edit plans on the device console itself, so we change the company agent settings to allow it and push/install an updated agent on the machine. 99% of the time we edit the plans from the web portal.
    Prior to these features being implemented in MSP360, we had that worst case actually happen. What saved us is that they forgot to delete one of our three backups.
  • Image Backups of Virtual Servers
    Thanks David. We did a test of Option #4 and it worked great.