In the ever-evolving landscape of cybersecurity threats, safeguarding data integrity within backup systems has become paramount. Malware, in its various forms, poses a significant risk to the reliability and security of critical data backups. This article delves into the pivotal role of robust malware detection in ensuring the resilience of backup solutions, emphasizing real-time scanning techniques, periodic scans, and updates to identify and contain potential threats promptly.

Understanding the Threat Landscape

Direct Attack on the Backup Systems

Malware, including viruses, ransomware, and other malicious software, can infiltrate backup systems through various vectors. Whether through compromised network connections, infected devices, or malicious email attachments, the threat is omnipresent. Once in the backup environment, malware can lie dormant and wait for the right moment to strike, compromising the integrity of the backup data. In the worst case, the entire backup repositories ar
Azure Portal

To use immutability with Azure Blob Storage, you must configure the storage account correctly during creation. On the “Data protection” tab of storage account creation:

Disable “Soft delete for blobs”
Enable “Versioning for blobs”
Select “Keep all versions”
Disable “Version-level immutability support”

Set “Blob public access” to Disabled, then create a container: give the container a name and set its “Public access level” to “Private”.

Container-Level Setting

After creating the storage account with the required settings, create a new container with the following setting: expand the Advanced section and enable “Version-level immutability support”.

Notes:
Do not configure lifecycle management policies for the container's data.
Do not enable any default immutability policy at the storage account level.

After the container is created, go to Security + networking → Access keys and copy the key1 key.

Veeam Data Platform Setting

Launch the New Object Storage Repository wizard: Add Backup Repository, Object storag
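The portal steps above can also be sketched with the Azure CLI. This is a minimal, hedged example: the resource group, account, container, and region names are placeholders, and you should verify the flags against the current `az` documentation before relying on it.

```shell
# Placeholder names: rg-backup, veeamimmutable, veeam-backups, westeurope.

# Create the StorageV2 account with blob public access disabled.
az storage account create \
  --name veeamimmutable \
  --resource-group rg-backup \
  --location westeurope \
  --sku Standard_LRS \
  --kind StorageV2 \
  --allow-blob-public-access false

# Enable blob versioning at the account level (soft delete stays disabled).
az storage account blob-service-properties update \
  --account-name veeamimmutable \
  --resource-group rg-backup \
  --enable-versioning true

# Create the container with version-level immutability (WORM) enabled.
az storage container-rm create \
  --name veeam-backups \
  --storage-account veeamimmutable \
  --resource-group rg-backup \
  --enable-vlw

# Retrieve key1 for the Veeam repository wizard.
az storage account keys list \
  --account-name veeamimmutable \
  --resource-group rg-backup \
  --query "[0].value" -o tsv
```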
Hi guys, I need some help to export my incremental files to tape.

Policies: archive full backups once a week on new tapes and export the used tapes; archive incremental backups on new tapes and export the used tapes.

Configuration settings:
Media pool: full and incremental backups use different media pools.
Media pool (retention): protect data for <desired period>.
Media set (retention): create a new media set for every backup session.
Source (backup job): forward incremental backup that runs daily with synthetic.
Tape job: export current media set on specific days.
Scheduling: forward incremental job running every day with a weekly full backup; the tape job is scheduled to run every day after the primary backup.

When I configure only full backup archiving, everything works fine: my synthetic fulls are created on Friday night, the tape job is scheduled for Saturday morning, and the job copies only the latest backup files. Now I have configured my job to copy the
Good day Community - There's been a lot of talk around AI, especially these past several months. I'm not talking about any specific area, really... just AI in general. AI is really gaining steam, in my opinion! Even our beloved Veeam has found a useful way to implement it within its product. 🙂 But is everyone looking to implement/integrate AI in everything really a good thing? I guess the best answer is, “it depends”. 😉

As someone who works in the education space, this is a topic I am really trying to wrap my brain around: how best to implement pieces of AI that benefit the student population and help our district be as “cutting edge” and “relevant” in the education space as possible; and where and how to put boundaries in place to prevent its abuse. ← And this here is really a huge and broad area. Where do administrators begin? What systems can be put in place to prevent abuse of AI tools? Etc.

At my org, we are looking to hire a Junior Systems Engineer, and one of our interview
What is the best combination of permissions to connect Kasten K10 with Veeam Backup & Replication through a service account? For example, a kasten-veeam-admin user. Thanks!
If, after updating Veeam CDP to v12.1, you still receive the error "Failed to perform CDP components deployment Error: The operation is not allowed in the current state." even though you have updated the I/O filters on the ESXi hosts, you should check the "I_O_filter_deployment" log located under "C:\ProgramData\Veeam\Backup\Utils". In that log you may encounter errors related to port 33035; this port was newly added with Veeam v12.1. If you review the CDP Components section of the port requirements guide, you can see the rules that need to be allowed.
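To quickly check whether TCP port 33035 is actually reachable between the components, a small socket probe can help. This is a generic sketch; the host name in the example is a placeholder, not taken from the post.

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder host name):
# port_open("esxi01.example.local", 33035)
```

If the probe fails, check firewalls between the Veeam server and the ESXi hosts before retrying the CDP component deployment.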
After upgrading Kasten K10 I received the following errors in the pods and deployments. The console shows this, and another upgrade attempt shows this:

root@SIPC09074:~/kasten10# helm repo update && helm upgrade k10 kasten/k10 --namespace=kasten-io -f k10_val.yaml --version=6.5.4
WARNING: Kubernetes configuration file is group-readable. This is insecure. Location: /root/.kube/config
WARNING: Kubernetes configuration file is world-readable. This is insecure. Location: /root/.kube/config
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "kasten" chart repository
Update Complete. ⎈Happy Helming!⎈
WARNING: Kubernetes configuration file is group-readable. This is insecure. Location: /root/.kube/config
WARNING: Kubernetes configuration file is world-readable. This is insecure. Location: /root/.kube/config
Error: UPGRADE FAILED: Unable to continue with update: ConfigMap "k10-eula-info" in namespace "kasten-io" exists and cannot be imported into the current
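This "exists and cannot be imported into the current release" failure is typically Helm refusing to adopt a resource that lacks its ownership metadata. A commonly used workaround (verify it is appropriate for your environment before applying) is to add the labels and annotations Helm expects, using the release and namespace names from the post:

```shell
# Mark the ConfigMap as Helm-managed so the k10 release can adopt it.
kubectl -n kasten-io label configmap k10-eula-info \
  app.kubernetes.io/managed-by=Helm --overwrite
kubectl -n kasten-io annotate configmap k10-eula-info \
  meta.helm.sh/release-name=k10 \
  meta.helm.sh/release-namespace=kasten-io --overwrite
```

After that, re-run the `helm upgrade` command; Helm should then treat the ConfigMap as part of the release instead of a foreign object.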
Hi, I have deployed Kasten K10 on OpenShift 4.13 as an operator. After that I created a route. However, I still cannot access the K10 dashboard. Regards, MLX
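For comparison, a route to the K10 dashboard is usually created against K10's `gateway` service. This is a hedged sketch (route name is a placeholder; confirm the service name and dashboard path against the Kasten documentation for your K10 version):

```shell
# Create an edge-terminated route to K10's gateway service in kasten-io.
oc -n kasten-io create route edge k10-dashboard \
  --service=gateway --insecure-policy=Redirect

# The dashboard is then typically served under the /k10/ path:
#   https://<route-host>/k10/
```

If the route exists but the dashboard still does not load, check that the route targets the gateway service's correct port and that you are browsing to the /k10/ path rather than the route root.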
Let's say a periodic legacy backup copy job “LEGACY” was configured to copy three source jobs: JOB-A, JOB-B, and JOB-C.

During the creation of the new-format backup copy job named “NEW”, only source job JOB-A was initially included. On the Storage tab, target mapping was used to point to the “LEGACY” job. This triggered a format upgrade for all items inside the LEGACY job, creating a new target named “NEW”.

Now I want to edit the “NEW” job to add JOB-B as an additional source. Do I need to return to the Storage tab of the job, choose target mapping, and select “LEGACY” again? What happens if I do? Will it recognise that this chain has already been converted and correctly target “NEW” instead?
Today, I noticed there are two fresh vulnerabilities affecting VBR 12.1 Manager and console servers. Certain .NET Core requirements are installed when the product is installed; unfortunately, .NET is not patched automatically through Windows Update.

CVE-2023-36049 - .NET, .NET Framework, and Visual Studio Elevation of Privilege Vulnerability
https://msrc.microsoft.com/update-guide/vulnerability/CVE-2023-36049
This security advisory was released by Microsoft to inform users of a vulnerability present in .NET 6.0, .NET 7.0, and .NET 8.0 RC2. Additionally, the alert offers suggestions on how developers should update their apps to fix this vulnerability. When untrusted URIs are sent to System.Net, a vulnerability in .NET allows elevation of privilege: it is possible to insert arbitrary commands into backend FTP servers using WebRequest.Create.

CVE-2023-36558 - ASP.NET Core - Security Feature Bypass Vulnerability
https://msrc.microsoft.com/update-guide/vulnerability/CVE-2023-36558
M