GIT BASH

 

  • Bash Shell:
    • Git Bash uses the Bash (Bourne Again SHell) command-line interpreter. This means you can use many of the same commands you'd find in a Linux or macOS terminal.
  • Git Integration:
    • Git Bash is tightly integrated with Git, making it easy to execute Git commands.
  • Essential Commands (see the example session at the end of this list):
    • Navigation:
      • pwd: Prints the current working directory.
      • ls: Lists files and directories in the current directory.
      • cd <directory>: Changes the current directory.
      • cd ..: Moves to the parent directory.
    • File Management:
      • mkdir <directory>: Creates a new directory.
      • touch <file>: Creates a new file.
      • rm <file>: Removes a file.
      • rmdir <directory>: Removes an empty directory.
    • Git Commands:
      • git init: Initializes a new Git repository.
      • git clone <repository URL>: Clones an existing Git repository.
      • git status: Displays the status of your working directory.
      • git add <file>: Adds a file to the staging area.
      • git commit -m "commit message": Commits changes with a message.
      • git push: Pushes changes to a remote repository.
      • git pull: Pulls changes from a remote repository.
      • git branch: Lists, creates, or deletes branches.
      • git checkout: Switches branches.
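    • Example session (a minimal sketch; demo-project, notes.txt, and feature-notes are made-up names, and git push assumes a remote is already configured):
      mkdir demo-project        # create a project folder
      cd demo-project           # move into it
      pwd                       # confirm the current directory
      git init                  # start a new repository
      touch notes.txt           # create an empty file
      git status                # notes.txt shows up as untracked
      git add notes.txt         # stage the file
      git commit -m "Add notes file"
      git branch feature-notes  # create a branch
      git checkout feature-notes
      git push                  # assumes a remote and upstream branch are set up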

  • Getting Started:
    • Opening Git Bash:
      • After installation, you can open Git Bash from the Windows Start menu.
      • You can also right-click on a folder and select "Git Bash Here" to open Git Bash in that folder.
    • Configuration:
      • It's a good practice to configure your Git username and email:
        • git config --global user.name "Your Name"
        • git config --global user.email "your.email@example.com"
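      • To confirm the values were saved, you can list your global settings (the output lines shown here are illustrative):
        git config --global --list
        # user.name=Your Name
        # user.email=your.email@example.com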

  • Key Advantages:
    • Unix-like Environment:
      • Provides a familiar command-line experience for developers who work with Unix-based systems.
    • Git Convenience:
      • Streamlines Git workflows by providing a dedicated environment for Git commands.
    • Versatility:
      • Allows you to use other command-line tools and utilities that are common in Unix environments.
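      • For example, utilities such as grep, ssh-keygen, and curl are typically bundled with Git Bash (the exact set can vary by version), so commands like these work out of the box:
        grep -rn "TODO" .                                    # search files in the current folder for TODO
        ssh-keygen -t ed25519 -C "your.email@example.com"    # generate an SSH key for GitHub/GitLab
        curl -O https://example.com/data.csv                 # download a file (placeholder URL)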

  • Tips:
    • Practice basic Bash commands to become comfortable with the command-line interface.
    • Refer to the Git documentation for detailed information on Git commands.
    • Use online resources and tutorials to learn more about Git Bash and Git.