Saturday, December 21, 2024

A Comprehensive Guide: Backing Up and Preparing to Restore a Website on a Linux VPS

Backing up a website hosted on a Linux VPS can seem daunting, particularly when the hosting configuration is unknown. This guide walks you through identifying the web server, understanding the hosting configuration, creating a backup, and preparing for restoration.

Step 1: Identifying the Web Server


Before initiating a backup, the first task is to understand what web services are running on the server.

Login to the Server

Use SSH to access the VPS:


ssh root@<server-ip>


Check Running Processes

Identify the web server (e.g., Apache or Nginx) using:


ps aux | grep -E 'nginx|apache|httpd'


Example output:


root 788 0.0 0.0 121936 2220 ? Ss 09:06 0:00 nginx: master process /usr/sbin/nginx


From this, we know Nginx is running on the server.
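
The process check can be wrapped in a small helper. This is a minimal sketch, not a standard tool: the function name detect_webserver is ours, and it simply classifies a ps-style process line by keyword.

```shell
#!/bin/sh
# Hypothetical helper (not a standard tool): classify a web server
# from a single ps-style process line.
detect_webserver() {
    case "$1" in
        *nginx*)          echo "nginx" ;;
        *apache*|*httpd*) echo "apache" ;;
        *)                echo "unknown" ;;
    esac
}

# Feed it the line from the example output above:
detect_webserver "root 788 0.0 0.0 121936 2220 ? Ss 09:06 0:00 nginx: master process /usr/sbin/nginx"
```

On a live server you would pipe the output of `ps aux` through a loop over lines instead of passing a literal string.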


Check Installed Packages

Confirm installed web servers:


yum list installed | grep -E 'nginx|apache'


This shows which web server packages are installed; both Nginx and Apache may be present even though only one is active. (On Debian- or Ubuntu-based systems, use dpkg -l or apt list --installed instead of yum.)


Step 2: Locating the Website Files


The next step is to find where the website files are stored.


Check Web Server Configuration

Look for the root directory in the configuration files:

For Nginx:

grep -r 'root' /etc/nginx/

For Apache:

grep -r 'DocumentRoot' /etc/httpd/


Example for Nginx:


root /var/www/knights;


This indicates the files are in /var/www/knights.
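
If you want just the path rather than the whole grep match, a short awk one-liner can pull it out. This sketch runs against a sample config written to /tmp (a throwaway path for illustration); on a real server you would point it at the files under /etc/nginx/ instead.

```shell
#!/bin/sh
# Write a sample Nginx config snippet to a throwaway path.
cat > /tmp/site.conf <<'EOF'
server {
    listen 80;
    root /var/www/knights;
}
EOF

# Print the value of the "root" directive, stripping the trailing semicolon.
awk '$1 == "root" { gsub(";", "", $2); print $2 }' /tmp/site.conf
```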


Explore the Directory

Navigate to the directory and list its contents:


cd /var/www/knights

ls -l


Identify the Website Type

Look for specific files that indicate the framework or CMS:

WordPress: wp-config.php, wp-content/

Laravel: .env, artisan

Static Website: Files like index.html, manifest.json, static/

In this example, files like index.html, manifest.json, and static/ suggest it’s a static website or single-page application (SPA).
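
The marker-file checks above can be scripted. This is a heuristic sketch only: the function name guess_site_type is ours, and the demo directory under /tmp is a stand-in for the real web root.

```shell
#!/bin/sh
# Hypothetical helper: guess the site type from marker files.
guess_site_type() {
    dir="$1"
    if [ -f "$dir/wp-config.php" ]; then
        echo "wordpress"
    elif [ -f "$dir/artisan" ]; then
        echo "laravel"
    elif [ -f "$dir/index.html" ]; then
        echo "static"
    else
        echo "unknown"
    fi
}

# Demo against a throwaway directory that mimics a static site:
mkdir -p /tmp/demo-site && touch /tmp/demo-site/index.html
guess_site_type /tmp/demo-site
```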


Step 3: Creating a Backup


Once the hosting configuration is understood, create a complete backup of the website and any additional required files.

Backup Website Files

Compress the website directory:


tar -czvf website_backup.tar.gz /var/www/knights


Include Files from Root Directory

If there are important files in the root folder:


tar -czvf complete_backup.tar.gz /var/www/knights /root/file1 /root/file2
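
It is worth confirming the archive actually contains what you expect before moving it anywhere. The following sketch does a miniature round trip against throwaway /tmp paths: build a tiny tree, archive it, and list the archive contents.

```shell
#!/bin/sh
# Build a tiny example tree (the /tmp paths are demo stand-ins).
mkdir -p /tmp/bk-demo/site
echo "hello" > /tmp/bk-demo/site/index.html

# Archive it; -C keeps the paths in the archive relative.
tar -czf /tmp/bk-demo/backup.tar.gz -C /tmp/bk-demo site

# List the archive contents to confirm the files made it in.
tar -tzf /tmp/bk-demo/backup.tar.gz
```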


Transfer Backup to Local Machine

Use scp to copy the backup to your local machine:


scp root@<server-ip>:/root/complete_backup.tar.gz /path/to/local/directory


Step 4: Monitoring Folder Capacity


To ensure disk usage remains within limits during the backup process:

Check Folder Size

Use the du command:


du -sh /var/www/knights


Example output:


438M /var/www/knights


Monitor Changes in Real-Time

Use the watch command to monitor folder size live:


watch -n 5 du -sh /var/www/knights


Find Large Files

If space is unexpectedly high, locate large files:


find /var/www/knights -type f -size +100M -exec ls -lh {} \;
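
Another way to drill down is to rank everything under a directory by size. This sketch builds a throwaway /tmp directory with one large and one small file purely for demonstration, then sorts du output so the biggest entries come first.

```shell
#!/bin/sh
# Create demo files of different sizes under a throwaway path.
mkdir -p /tmp/du-demo
dd if=/dev/zero of=/tmp/du-demo/big.bin bs=1024 count=200 2>/dev/null
dd if=/dev/zero of=/tmp/du-demo/small.bin bs=1024 count=10 2>/dev/null

# List every entry with its size, largest first.
du -a /tmp/du-demo | sort -rn | head -5
```

On a real server you would run the same pipeline against the web root to spot the biggest space consumers at a glance.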


Step 5: Preparing for Restoration


With the backup complete, prepare to restore the website if needed:

Verify Backup

List the archive's contents locally to verify it includes the required files:


tar -tzvf /path/to/local/directory/complete_backup.tar.gz


Restore Backup

To restore, place the files back in their original directory:


tar -xzvf complete_backup.tar.gz -C /
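
Before extracting over a live tree, it can be safer to restore into a staging directory and compare first. This sketch demonstrates the idea end to end with throwaway /tmp paths standing in for the real server directories.

```shell
#!/bin/sh
# Set up a demo "live" tree and archive it (paths are /tmp stand-ins).
mkdir -p /tmp/restore-demo/live/site /tmp/restore-demo/stage
echo "v1" > /tmp/restore-demo/live/site/index.html
tar -czf /tmp/restore-demo/backup.tar.gz -C /tmp/restore-demo/live site

# Restore into a staging directory, not over the live tree.
tar -xzf /tmp/restore-demo/backup.tar.gz -C /tmp/restore-demo/stage

# Compare staging against live before committing to the overwrite.
diff -r /tmp/restore-demo/live/site /tmp/restore-demo/stage/site && echo "backup matches live tree"
```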


Restart Web Server

Ensure the web server is restarted to recognize the restored files:


For Nginx:

systemctl restart nginx


For Apache:

systemctl restart httpd


Backing up a website on a Linux VPS involves identifying the web server, locating the website files, and creating a reliable backup. Monitoring disk usage ensures a smooth process, while preparation for restoration ensures minimal downtime. These practices not only safeguard your data but also empower you with confidence in server management.






Tuesday, October 8, 2024

Visualize Folder Structures Like WinDirStat Using PowerShell

If you’re searching for a way to visualize folder structures similar to WinDirStat, but using PowerShell, you’re in the right place. In many production environments, installing third-party tools may not be allowed due to security or compliance policies. However, you can still gather detailed disk usage information using PowerShell. This method provides an efficient way to analyze folder sizes without installing any additional software.


Why Use PowerShell?


PowerShell is a built-in tool on Windows systems and offers powerful commands to automate and analyze tasks directly. By leveraging PowerShell, you can easily gather disk usage information and visualize folder sizes in a tabular format. This approach is not only lightweight but also perfect for production environments where the use of third-party tools is restricted.


Step-by-Step Guide: Visualising Folder Sizes with PowerShell


Here’s how you can create a script that lists folder sizes, sorted from largest to smallest, giving you a view similar to what tools like WinDirStat provide:


# Define the path to analyze
$path = "C:\"

# Get all directories and calculate their sizes
$folderSizes = Get-ChildItem -Path $path -Directory -Recurse -ErrorAction SilentlyContinue |
    ForEach-Object {
        try {
            $size = (Get-ChildItem -Path $_.FullName -Recurse -File -ErrorAction SilentlyContinue | Measure-Object -Property Length -Sum).Sum
            [PSCustomObject]@{
                FolderPath = $_.FullName
                SizeMB = [math]::Round($size / 1MB, 2)
                SizeGB = [math]::Round($size / 1GB, 2)
            }
        } catch {
            # Skip folders that cannot be read
        }
    } | Sort-Object -Property SizeGB -Descending

# Display the output in a table format
$folderSizes | Format-Table -Property FolderPath, SizeMB, SizeGB -AutoSize

# Optional: Export to CSV
# $folderSizes | Export-Csv -Path "C:\folder_sizes.csv" -NoTypeInformation


Script Explanation


  •    $path: Set this variable to the drive or folder you want to analyze. For example, "C:\" scans the entire C drive.
  •    Get-ChildItem: This command retrieves all directories recursively under the specified path.
  •    ForEach-Object: Iterates through each folder, calculating its total size.
  •    [PSCustomObject]: Creates a structured object for each folder, including its path and size in MB and GB.
  •    Sort-Object: Sorts folders by size, displaying the largest first.
  •    Format-Table: Displays the output as a table directly in the PowerShell console.


Why This Script is Useful


• No Third-Party Tools Required: This method is perfect for environments where you can’t install external software.

• Efficient and Automated: PowerShell provides a flexible and powerful way to gather information and automate tasks.

• Customizable Output: You can modify the script to suit specific needs, such as exporting data to CSV for further analysis.


Tips for Running the Script


  •   Run PowerShell as Administrator: This script may require administrative permissions to access protected folders.
  •   Be Patient: Scanning large drives can take some time. The script is designed to skip over restricted folders and continue processing.
  •   Exporting the Results: You can uncomment the Export-Csv line to save the results to a CSV file, making it easier to share or analyze the data further.

Step-by-Step Guide: Upgrading Dynamics 365 CRM On-Premises

Upgrading Dynamics 365 CRM on-premises can be a straightforward process, but it’s important to follow a structured approach to ensure a successful and error-free upgrade. This guide will take you through the entire process, from preparation to verifying the upgrade, including troubleshooting any issues that arise, such as database version mismatches.


1. Pre-Upgrade Preparation


Before you begin the upgrade, it’s crucial to prepare your environment to avoid data loss and minimize downtime.


1. Backup Everything:


  • Database Backup: Back up all CRM-related databases (e.g., Org_MSCRM and MSCRM_Config) using SQL Server Management Studio (SSMS).
  • CRM Application Files: Back up the CRM application files on the CRM server.
  • Customisations: Export customisations, plugins, and workflows from CRM.


2. Check the Current CRM Version:


  • Log into Dynamics CRM and navigate to Settings > About. Note the current version (e.g., 9.0.21.8).


3. Verify System Compatibility:


  • Ensure your SQL Server, Windows Server, and CRM application are compatible with the version you plan to upgrade to (e.g., 9.0.28.1 or 9.1.x).


2. Download the CRM Update Files


    1. Visit the Microsoft Dynamics 365 Updates Page and locate the appropriate update version (e.g., 9.0.28.1).


    2. Download the relevant files:


  • CRM Server Update (CRM9.0-Server-KBxxxxx-ENU-amd64.exe).
  • SRS Connector (CRM9.0-Srs-KBxxxxx-ENU-amd64.exe) if you use SQL Server Reporting Services (SSRS).


3. Apply the Update to the CRM Server


    1. Run the CRM Server Update:


  • Log into the CRM application server.
  • Run the CRM Server Update file and follow the on-screen instructions.
  • Monitor the installation process and check for any errors.


    2. Restart the Server:

  • After the update completes, restart the CRM server to apply changes.


    3. Verify the CRM Application Version:


  • Log into CRM and go to Settings > About to confirm that the version has been updated.


4. Troubleshooting Database Version Mismatch


If the application version updates successfully but the database version remains out of sync, follow these steps:


    1. Verify the Database Version:


  • Open SQL Server Management Studio (SSMS) on the SQL Server hosting your CRM database.
  • Run the following query on the Org_MSCRM database:


    SELECT * FROM BuildVersion;


  • If the version displayed (e.g., 9.0.21.8) doesn’t match the CRM application version, proceed to the next step.


    2. Ensure Proper Permissions:


  • Confirm that the CRM service account has db_owner permissions on the Org_MSCRM and MSCRM_Config databases.


    3. Use CRM Deployment Manager to Update the Database:


  • On the CRM server, open CRM Deployment Manager.
  • Expand Organizations, right-click your CRM organization, and select Update.
  • This will trigger the database schema update to synchronize with the CRM application version.


    4. Verify the Database Version Again:


  • After the update process completes, rerun the SQL query in SSMS to ensure the database version now matches the CRM application version.


5. Install the Reporting Extensions (If Using SSRS)


If your environment uses SQL Server Reporting Services (SSRS), you need to update the SRS Connector:


    1. Run the SRS Connector Installer:


  • Download and run the CRM9.0-Srs-KBxxxxx-ENU-amd64.exe file on the SQL Server where SSRS is installed.
  • Follow the prompts and enter the Report Server URL during installation.


    2. Restart SSRS and CRM Services:


  • Restart the SQL Server Reporting Services and CRM-related services to ensure everything synchronizes properly.


6. Post-Upgrade Validation


After successfully updating both the CRM application and database:


1. Verify Functionality:


  • Test basic CRM functionalities, such as accessing records, running workflows, and generating reports.
  • Ensure that all custom plugins and integrations are functioning as expected.


2. Test Reports (if using SSRS):


  • Run a sample report to verify that the SSRS integration is working correctly.


3. Review Event Viewer Logs:


  • Check the Windows Event Viewer on both the CRM and SQL servers for any errors or warnings related to Dynamics CRM.