Func and Action in .NET

Scenario:

Func and Action example in .NET

Solution:

  • Method with Func and Action
       private T Push(Func<HttpClient, HttpContent, HttpResponseMessage> request, Action<StreamWriter> serialize, Func<StreamReader, T> deserialize)
       {
           ...

           using (var ms = new MemoryStream())
           {
               string data;

               // Let the caller write the request payload; leaveOpen: true keeps ms usable afterwards.
               using (var sw = new StreamWriter(ms, Encoding, 1024, true))
               {
                   serialize(sw);
               }

               ms.Position = 0;

               using (var sr = new StreamReader(ms, Encoding, true, 1024, true))
               {
                   data = sr.ReadToEnd();
               }

               ...

               // Invoke the caller-supplied HTTP call (null-safe).
               response = request?.Invoke(httpClient, httpContent);

               ...

               // Let the caller turn the response stream back into T.
               var info = deserialize.Invoke(streamReader);

               return info;
           }
       }
  • Call
        private async Task Call()
        {
            // Assumes an awaitable Push overload that supplies the request delegate itself.
            await client.Push(
                sw => serializer.WriteObject(sw.BaseStream, inputData),
                sr =>
                {
                    var response =
                        serializer.ReadObject(sr.BaseStream) as Response;

                    return response;
                });
        }
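  • For context, a minimal self-contained sketch of the same idea: Action<T> is a void-returning callback, Func<T, TResult> returns a value. The names here (DelegateDemo, Transform) are illustrative, not from the code above.

       using System;

       public static class DelegateDemo
       {
           // Func supplies the transformation, Action consumes the result.
           private static void Transform(string input, Func<string, string> transform, Action<string> onDone)
           {
               onDone(transform(input));
           }

           public static void Main()
           {
               Transform("hello", s => s.ToUpperInvariant(), Console.WriteLine); // prints HELLO
           }
       }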

Use ConfigDescriptor to read config

Scenario:

Use ConfigDescriptor to read config

Solution:

  • Read the config
       namespace 
       {
           [Serializable]
           public class UserConfigs
           {
               public class UserConfig
               {
                   public UserConfig(string role, bool isActive)
                   {
                       Role = role;
                       IsActive = isActive;
                   }

                   public string Role { get; }

                   public bool IsActive { get; }
               }

               private Dictionary<string, UserConfig> _userConfigs;

               public void Load(string config)
               {
                   // Deserialize the config section into `configs` (elided), then index users by name.
                   _userConfigs = configs.Users
                       .Cast<User>()
                       .ToDictionary(u => u.Name,
                           u => new UserConfig(u.Role, u.IsActive),
                           StringComparer.InvariantCultureIgnoreCase);
               }

               public bool TryGetUserConfig(string name, out UserConfig userConfig)
               {
                   try
                   {
                       return _userConfigs.TryGetValue(name, out userConfig);
                   }
                   catch
                   {
                       userConfig = default(UserConfig);
                       return false;
                   }
               }
           }
       }
  • Call the config
       private static UserConfig GetConfig(string name)
       {
           if (!config.TryGetUserConfig(name, out var userConfig))
           {
               // Handle the missing-config case here (log, default, or throw).
           }
           return userConfig;
       }
  • Use the config
    public void Process()
    {
        var config = GetConfig("users");
        var role = config.Role;
    }

Async with queue size

Scenario:

Process records asynchronously, in batches.

Solution:

  • Post async all
        public async Task PostAsync<T>(params T[] infos) where T : IData
        {
            await Task.WhenAll(infos.Select(async d =>
            {
                // Time each post individually.
                var stopwatch = Stopwatch.StartNew();
                await InternalPostAsync(d);
                stopwatch.Stop();
                var log = $"{stopwatch.ElapsedMilliseconds} ms";
            }));
        }
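  • The snippet above starts every post at once. If the queue size must cap in-flight work, here is a hedged sketch using SemaphoreSlim (maxConcurrency is an assumed setting; InternalPostAsync as above):

        public async Task PostThrottledAsync<T>(int maxConcurrency, params T[] infos) where T : IData
        {
            using (var gate = new SemaphoreSlim(maxConcurrency))
            {
                await Task.WhenAll(infos.Select(async d =>
                {
                    await gate.WaitAsync(); // wait while maxConcurrency posts are in flight
                    try { await InternalPostAsync(d); }
                    finally { gate.Release(); }
                }));
            }
        }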
  • Post async to queue
        public async Task PostAsync<T>(params T[] infos) where T : IData
        {
            // Hand the records off to the in-memory queue on a thread-pool thread.
            await Task.Run(() => _queue.AddRecord(infos as IData[]));
        }

  • Process batch in queue
        public void AddRecord(IData[] record)
        {
            if (record.Length > 0)
            {
                var size = ConfigConRequests; // configured batch size

                // Walk the records in steps of `size` (batch body elided; see the sketch below).
                for (var i = 0; i < record.Length; i += size)
                {
                    ...
                }
            }
        }
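  • A hedged sketch of what the elided batch body might look like, slicing the array with LINQ Skip/Take (ProcessBatch is an assumed downstream handler, not from the original):

        public void AddRecord(IData[] record)
        {
            if (record.Length == 0) return;

            var size = ConfigConRequests;

            for (var i = 0; i < record.Length; i += size)
            {
                // The final batch may be shorter than `size`.
                var batch = record.Skip(i).Take(size).ToArray();
                ProcessBatch(batch); // assumed handler
            }
        }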

Retry with Task and Async

Scenario:

Post data asynchronously. If it fails, retry.

Solution:

      private async Task<Tuple<bool, string>> PostAsync<T>(T data, int? retry = null)
      {
          var retryAttempt = retry ?? retryConfig;

          try
          {
              await Post(data).ConfigureAwait(false);
              return new Tuple<bool, string>(true, default(string));
          }
          catch (Exception e)
          {
              if (retryAttempt > 0)
              {
                  // Fire-and-forget: schedule the retry after a delay without awaiting it,
                  // so this call still reports the failure to its caller.
      #pragma warning disable 4014
                  Task.Run(async () =>
                  {
                      await Task.Delay(TimeSpan.FromMilliseconds(ConfigRetryDelay));
                      await PostAsync(data, --retryAttempt);
                  });
      #pragma warning restore 4014
              }
              return new Tuple<bool, string>(false, e.Message);
          }
      }
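If the caller should only hear back once all retries are exhausted, an awaited variant is also possible. A sketch under the same assumed Post/ConfigRetryDelay members:

      private async Task<Tuple<bool, string>> PostWithRetryAsync<T>(T data, int retries)
      {
          for (var attempt = 0; ; attempt++)
          {
              try
              {
                  await Post(data).ConfigureAwait(false);
                  return new Tuple<bool, string>(true, default(string));
              }
              catch (Exception) when (attempt < retries)
              {
                  // Swallow the failure and wait before the next attempt, only while retries remain.
                  await Task.Delay(TimeSpan.FromMilliseconds(ConfigRetryDelay)).ConfigureAwait(false);
              }
              catch (Exception e)
              {
                  return new Tuple<bool, string>(false, e.Message);
              }
          }
      }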

Git: create a custom branch for dev (using TortoiseGit)

Scenario:

Create a custom dev branch in Git, to be merged into the main branch later.

Solution:

  1. Fetch & rebase your project.
  2. Switch to the branch (e.g. release) you want to use as the base for your custom branch.
  3. Stash your changes (stash save).
  4. Push the branch from step 2 to a new remote branch, e.g. release-task.
  5. Switch to the branch created in step 4.
  6. Pop your stashed changes (stash pop).
  7. Commit your changes.
  8. Push your changes to the branch from step 4 (command-line equivalent below).
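The same flow with command-line git, for reference (branch names are examples):

  git pull --rebase                                   # 1. fetch & rebase
  git checkout release                                # 2. base branch
  git stash                                           # 3. stash save
  git push origin release:release-task                # 4. create the remote branch
  git checkout -b release-task origin/release-task    # 5. switch to it locally
  git stash pop                                       # 6. restore your changes
  git commit -am "Describe your change"               # 7. commit
  git push origin release-task                        # 8. push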

WCF Test Client

Scenario:

Use WCF Test Client to test APIs

Solution:

  1. Open C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\Common7\IDE\WcfTestClient.exe
  2. Right-click [Add Service] -> enter your WCF service endpoint.
  3. Select a service method.
  4. Change the request parameter values.
  5. Click Send.
  6. The response is displayed with its fields.

Generate and Setup a new SSH key using PuTTY

Scenario:

Generate and Setup a new SSH key using PuTTY

Solution:

  1. Go to https://www.chiark.greenend.org.uk/~sgtatham/putty/latest.html -> download the .msi file from the package files section and install it.
  2. Open PuTTYgen -> Generate [generate some randomness (entropy) by moving the mouse over the blank area].
  3. The key is generated and displayed in the public key text area. Copy it to your clipboard and add it to your GitHub account.
  4. Create an .ssh folder in your user directory.
  5. Enter a passphrase for your SSH key [it can be left blank] -> save the private key to a folder [extension .ppk].
  6. Store the public key (by copying its content) as id_rsa.pub in the folder from step 4.
  7. Store the private key in OpenSSH format by clicking Conversions -> Export OpenSSH key (force new file format), saved as a file named id_rsa.
  8. By default, git looks in C:\Users\<username>\.ssh for the id_rsa key.
  9. To load keys from an SSH client like PuTTY, add an environment variable [GIT_SSH] with the value "C:\Program Files\PuTTY\plink.exe".
  10. Now, when using Git from the command line, it will use the SSH agent instead of loading the id_rsa key.
  11. PowerShell -> git clone git@gitlab.com:username/repository-name.git

Generate and Setup a new SSH key using Git Bash

Scenario:

Generate and Setup a new SSH key using Git Bash

Solution:

  1. Open Git Bash.
  2. Run the command below.
  3. $ ssh-keygen -t rsa -b 4096 -C "email@email.com"
    > Enter a file in which to save the key (/c/Users/you/.ssh/id_rsa): [enter for default]
    > Enter passphrase (empty for no passphrase): [type a passphrase]
  4. The key pair is created by the step above.
  5. Ensure the ssh-agent is running:
    1. eval $(ssh-agent -s)
  6. Add your SSH private key to the ssh-agent:
    1. ssh-add ~/.ssh/id_rsa
  7. Copy the SSH public key to your clipboard:
    1. clip < ~/.ssh/id_rsa.pub
  8. Go to GitHub -> Account -> Settings -> SSH and GPG keys -> New SSH key or Add SSH key -> Title: {custom}, Key: {paste here} -> Add SSH key.
  9. Change the remote URL to the SSH URL:
    1. On the command line: git remote set-url origin git@github.com:TestUser/Test-Git-Project.git
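To verify the key is picked up, GitHub's standard connection test can be run (it prints a greeting with your username on success):

  $ ssh -T git@github.com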

Setup GIT for a project (GitHub)

Scenario:

Create a new project and save the repository on GitHub.

Solution:

  1. Download Git from https://git-scm.com/downloads and set it up.
  2. Create an account on GitHub: https://github.com/
  3. Check whether Git is tracking your project.
    1. Command line -> navigate to the Test Git Project folder -> git status
      • If you see "fatal: Not a git repository (or any of the parent directories): .git", the folder is not being tracked by git.
  4. To initialize Git:
    1. Command line -> navigate to the Test Git Project folder -> git init
  5. GitHub account -> '+' icon -> 'New Repository' -> "Test Git Project" -> 'Create Repository'.
  6. Connect the local folder to the GitHub repository.
    1. Copy the link under the title, e.g. "https://github.com/.../Test-Git-Project.git"
  7. On the command line, run the commands below:
  8. git remote add origin https://github.com/mindplace/test-repo.git
     git add .                          [add all files to the staging area]
     git commit -m "initial commit"
     git push origin master             [log in when prompted]

  9. GitHub -> repository screen -> refresh the project.
  10. Your project and files are now available on GitHub.

.NET Functions guide

Scenarios:

--StringSplitOptions

var firstError = data[i].Errors?.Select(e => e.Exception?.Message).FirstOrDefault()
                        ?.Split(new[] { '.' }, StringSplitOptions.RemoveEmptyEntries).First();
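A self-contained illustration of RemoveEmptyEntries (the input string is made up):

    using System;

    public static class SplitDemo
    {
        public static void Main()
        {
            var message = "Timeout..Retry failed.";

            // RemoveEmptyEntries drops the zero-length strings produced by consecutive dots.
            var parts = message.Split(new[] { '.' }, StringSplitOptions.RemoveEmptyEntries);

            Console.WriteLine(parts[0]);     // Timeout
            Console.WriteLine(parts.Length); // 2
        }
    }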

Test driven development and Mock service

Scenario:

Write a test to verify if the user can be successfully created.

Solution:

  • Create MockService as below
    1. public interface IRepoRes
       {
           IDataRepository<T> GetDataRepository<T>(DataCommand command);
       }

      // Requires: System.Diagnostics, System.Dynamic, System.IO, System.Linq,
      // System.Runtime.Serialization.Json, Newtonsoft.Json(.Linq), Xunit.
      public class MockRepo : IRepoRes
      {
          private readonly Dictionary<string, Option<object>> _info;

          public MockRepo(IRepoRes real)
          {
              _info = new Dictionary<string, Option<object>>(StringComparer.InvariantCultureIgnoreCase);
          }

          public IDataRepo<T> GetDataRepo<T>(DataCommand command)
          {
              // Caching via _info elided; load the mock data recorded for this command.
              MockRepo<T> loadedInfo;
              return TryLoadInfo(command, command.CommandKey, out loadedInfo)
                  ? loadedInfo
                  : null;
          }

          private bool TryLoadInfo<T>(DataCommand command, string key, out MockRepo<T> info)
          {
              try
              {
                  // Walk the stack to find the running [Fact] test method, so each test can have its own data file.
                  var st = new StackTrace();

                  var testClass =
                      st.GetFrames()
                          .Select(f => f.GetMethod())
                          .Where(m => m.GetCustomAttributes(typeof(FactAttribute), false).Length > 0)
                          .Select(m => m.DeclaringType)
                          .FirstOrDefault();

                  // Prefer a test-specific file ({TestClass}_{key}.json), else fall back to {key}.json.
                  var path = new[]
                  {
                      string.Format("{0}/App_info/DataCommands/{1}/{2}{3}.json",
                          Environment.CurrentDirectory, command.CommandKey,
                          testClass == null ? "" : string.Format("{0}_", testClass.Name), key),
                      string.Format("{0}/App_info/DataCommands/{1}/{2}.json",
                          Environment.CurrentDirectory, command.CommandKey, key)
                  }.FirstOrDefault(File.Exists);

                  var serializer = new DataContractJsonSerializer(typeof(T[]));

                  var d = new List<T>();
                  using (var inputStream = new FileStream(path, FileMode.Open))
                  {
                      using (var ms = new MemoryStream())
                      {
                          // Load the file into the MemoryStream.
                          inputStream.CopyTo(ms);
                          ms.Position = 0;

                          if (typeof(T) == typeof(ExpandoObject))
                          {
                              // Dynamic payloads: copy each JObject's properties into an ExpandoObject.
                              var objs = JsonConvert.DeserializeObject<IEnumerable<JObject>>(Encoding.Default.GetString(ms.ToArray()));

                              foreach (var obj in objs)
                              {
                                  dynamic dynObj = new ExpandoObject();
                                  var dynDict = dynObj as IDictionary<string, object>;
                                  foreach (var property in obj.Properties())
                                  {
                                      dynDict[property.Name] = obj.GetValue(property.Name);
                                  }
                                  d.Add((T)dynObj);
                              }
                          }
                          else
                          {
                              d.AddRange((T[])serializer.ReadObject(ms));
                          }
                      }
                  }

                  info = new MockRepo<T>();
                  info.AddData(key, d);
                  return true;
              }
              catch
              {
                  info = null;
                  return false;
              }
          }
      }


      public class MockRepo<T> : IDataRepo<T>
      {
          private readonly Dictionary<string, IEnumerable<T>> _info =
              new Dictionary<string, IEnumerable<T>>(StringComparer.InvariantCultureIgnoreCase);

          public void AddData(string key, IEnumerable<T> info)
          {
              _info[key] = info;
          }

          public IEnumerable<T> ExecuteQuery(DataCommand command)
          {
              IEnumerable<T> infos;

              // Key the mock data by the command's ordered parameters; "__d__" is the default key
              // (the condition was garbled in the original and is reconstructed here).
              var key = command.CommandParameters.Any()
                  ? string.Join("_", command.CommandParameters
                      .OrderBy(p => p.Name, StringComparer.InvariantCultureIgnoreCase)
                      .Select(p => string.Format("{0}={1}", p.Name, p.Value == null ? "" : p.Value.ToString())))
                  : "__d__";

              _info.TryGetValue(key, out infos);
              return infos;
          }
      }
  • Create Test case

      using Xunit;

      [Fact]
      public void TestGetUsers()
      {
          var users = LoadDataFile("TestUsers\\Users.json",
              s => _serializer.ReadObject(s) as User[]);

          Assert.True(userService.TryGetUsers(users, out var userInfos));
          Assert.NotNull(userInfos);
          Assert.Equal(1, userInfos[0].Id);
          Assert.Equal("Test", userInfos[0].Name);
      }

  • Users.json as below
    [
      {
        "Id": 1,
        "Name": "Test"
      }
    ]

  • GetUsersCommand [UserId=1.json]
      [
          {
              "UserId": 1,
              "Name": "Test"
          }
      ]

Setup Metricbeat for ELK on Docker

Scenario:

Set up Metricbeat to capture machine metrics (CPU, memory, etc.) and image/container metrics, and visualize them in Kibana, for ELK on Docker.

Solution:

  • Create a folder called metricbeat. Inside it, add the Dockerfile below:
      ARG ELK_VERSION
      
      FROM docker.elastic.co/beats/metricbeat:${ELK_VERSION}
      
      WORKDIR "/usr/share/metricbeat"
  • Create a file metricbeat.yml inside a config folder with the following content.
  • Note
    • Provide the host for docker.sock.
      metricbeat.modules:
      - module: docker
        metricsets:
          - "container"
          - "cpu"
          - "diskio"
          - "event"
          - "healthcheck"
          - "info"
          - "image"
          - "memory"
          - "network"
        hosts: ["unix:///var/run/docker.sock"]
        period: 10s
        enabled: true
      
      output.elasticsearch:
        # Array of hosts to connect to.
        hosts: ["elasticsearch:9200"]

  • In docker-compose.yml, add the below.
  • Note:
    • Mount /var/run/docker.sock.
    • Set privileged: true and user: root so it can access the logs and push the data to ES.
    .....
    
      metricbeat:
        build:
          context: ./metricbeat
          args:
            ELK_VERSION: $ELK_VERSION
        volumes:
          - /var/run/docker.sock:/var/run/docker.sock
          - /usr/local/bin/docker:/usr/bin/docker
          - /sys:/sys
          - type: bind
            source: ./metricbeat/config/metricbeat.yml
            target: /usr/share/metricbeat/metricbeat.yml
            read_only: true
        privileged: true
        user: root
        environment:
          - output.elasticsearch.hosts=["elasticsearch:9200"]
        networks:
          - elk
        depends_on:
          - elasticsearch
        restart: always
  • Note: X-Pack security is on by default, so ES is not accessible without credentials. To skip that while still using the trial license, disable security by adding the line below to elasticsearch.yml:
    • xpack.security.enabled: false
  • PowerShell -> docker-compose up -d --force-recreate --no-deps
  • Navigate to Kibana -> create an index pattern for metricbeat*, then open Discover to see the data and visualizations.
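  • To confirm documents are arriving before opening Kibana, the standard Elasticsearch APIs can be queried directly:
    curl http://localhost:9200/_cat/indices/metricbeat-*
    curl "http://localhost:9200/metricbeat-*/_search?size=1&pretty"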

Setup Filebeat for ELK on Docker

Scenario:

Set up Filebeat to process the logs and visualize them in Kibana, for ELK on Docker.

Solution:

  • Create a folder called filebeat. Inside it, add the Dockerfile below:
      ARG ELK_VERSION
      
      FROM docker.elastic.co/beats/filebeat:${ELK_VERSION}
      
      WORKDIR "/usr/share/filebeat"
  • Create a file filebeat.yml inside a config folder with the following content.
  • Note
    • This monitors the logs of all running containers.
    • It sets up the filebeat-* index format on ES, along with its template and dashboards on Kibana.
      filebeat.inputs:
      - type: log
        paths:
          - /var/lib/docker/containers/*/*.log
      
      setup.template.name: "filebeat-"
      setup.template.pattern: "filebeat-*"
      setup.dashboards.enabled: true
      
      output.elasticsearch:
         hosts: 'elasticsearch:9200'
         index: "filebeat-%{[beat.version]}-%{+yyyy.MM.dd}"
      
      setup.kibana:
        host: "kibana:5601"

  • In docker-compose.yml, add the below.
  • Note:
    • Mount /var/lib/docker/containers.
    • Set privileged: true and user: root so it can access the logs and push the data to ES.
    .....
    
      filebeat:
        build:
          context: ./filebeat
          args:
            ELK_VERSION: $ELK_VERSION
        volumes:
          - /var/lib/docker/containers:/var/lib/docker/containers
          - /sys:/sys
          - type: bind
            source: ./filebeat/config/filebeat.yml
            target: /usr/share/filebeat/filebeat.yml
            read_only: true
        privileged: true
        user: root
        environment:
          - output.elasticsearch.hosts=["elasticsearch:9200"]
        networks:
          - elk
        depends_on:
          - elasticsearch
          - kibana
        restart: always
  • Note: X-Pack security is on by default, so ES is not accessible without credentials. To skip that while still using the trial license, disable security by adding the line below to elasticsearch.yml:
    • xpack.security.enabled: false
  • PowerShell -> docker-compose up -d --force-recreate --no-deps
  • Navigate to Kibana -> Discover to see the data and visualization.
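  • As with Metricbeat, a quick existence check for the index: curl http://localhost:9200/_cat/indices/filebeat-*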

Setup Heartbeat for ELK on Docker

Scenario:

Setup Heartbeat for ELK on Docker to ping the given sites to check if they are up.

Solution:

  • Create a folder called heartbeat. Inside it, add the Dockerfile below:
    1. ARG ELK_VERSION

       FROM docker.elastic.co/beats/heartbeat:${ELK_VERSION}

       WORKDIR "/usr/share/heartbeat"

       # Auto-create Kibana Heartbeat dashboard tables
       RUN ./heartbeat setup --dashboards
  • Create a file heartbeat.yml with the following content (setup.kibana should point at the Kibana host):
    1. heartbeat.monitors:
       - type: http
         schedule: '@every 60s'
         urls:
           - http://localsite
           - http://google.com

         check.request:
           method: GET
           headers:
             'Content-Type': 'application/json'
         check.response:
           status: 200

       output.elasticsearch:
         hosts: 'elasticsearch:9200'

       setup.kibana:
         host: "kibana:5601"

  • In docker-compose.yml, add the below.
    .....

      heartbeat:
        build:
          context: ./heartbeat
          args:
            ELK_VERSION: $ELK_VERSION
        volumes:
          - type: bind
            source: ./heartbeat/config/heartbeat.yml
            target: /usr/share/heartbeat/heartbeat.yml
            read_only: true
        environment:
          - output.elasticsearch.hosts=["elasticsearch:9200"]
        networks:
          - elk
        depends_on:
          - elasticsearch
        restart: always
  • Note: X-Pack security is on by default, so ES is not accessible without credentials. To skip that while still using the trial license, disable security by adding the line below to elasticsearch.yml:
    • xpack.security.enabled: false
  • PowerShell -> docker-compose up -d --force-recreate --no-deps
  • Navigate to Kibana -> Uptime to see the metrics.

SQL Server - XML

Scenario:

SQL Server - querying an XML column

Solution:

  1. Query:
  2. SELECT DISTINCT Data.value('(/category/@categoryID)[1]', 'varchar(MAX)')
                    + '\'
                    + Data.value('(/category/@subCategoryID)[1]', 'varchar(MAX)')
    FROM   CategorySubCategoryXref
    WHERE  Data.value('(/category/@name)[1]', 'varchar(MAX)') = 'MyCategory'
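For illustration, a row whose Data column holds the XML below (made-up values) would return 10\20:

    <category categoryID="10" subCategoryID="20" name="MyCategory" />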

Setup ELK stack on Docker

Scenario:

Set up the ELK stack (Elasticsearch, Logstash, Kibana) on Docker.

Solution:

  1. Open PowerShell and run the commands below:
  2. cd C:\..\..\Software
     git clone https://github.com/deviantony/docker-elk.git
     cd .\docker-elk\
     Open docker-compose.yml and add the below: (not working)
       volumes:
         esdata:
           driver: local
     docker-compose up -d
     docker ps
  3. Navigate to:
    • ES: http://localhost:9200
    • Kibana: http://localhost:5601
    • Logstash: http://localhost:9600
  4. The docker-compose file, for reference:

     version: '3.2'

     services:
       elasticsearch:
         build:
           context: elasticsearch/
           args:
             ELK_VERSION: $ELK_VERSION
         volumes:
           - type: bind
             source: ./elasticsearch/config/elasticsearch.yml
             target: /usr/share/elasticsearch/config/elasticsearch.yml
             read_only: true
           - type: volume
             source: elasticsearch
             target: /usr/share/elasticsearch/data
         ports:
           - "9200:9200"
           - "9300:9300"
         environment:
           ES_JAVA_OPTS: "-Xmx256m -Xms256m"
           ELASTIC_PASSWORD: changeme
           # Use single node discovery in order to disable production mode and avoid bootstrap checks
           # see https://www.elastic.co/guide/en/elasticsearch/reference/current/bootstrap-checks.html
           discovery.type: single-node
         networks:
           - elk

       logstash:
         build:
           context: logstash/
           args:
             ELK_VERSION: $ELK_VERSION
         volumes:
           - type: bind
             source: ./logstash/config/logstash.yml
             target: /usr/share/logstash/config/logstash.yml
             read_only: true
           - type: bind
             source: ./logstash/pipeline
             target: /usr/share/logstash/pipeline
             read_only: true
         ports:
           - "5000:5000/tcp"
           - "5000:5000/udp"
           - "9600:9600"
         environment:
           LS_JAVA_OPTS: "-Xmx256m -Xms256m"
         networks:
           - elk
         depends_on:
           - elasticsearch

       kibana:
         build:
           context: kibana/
           args:
             ELK_VERSION: $ELK_VERSION
         volumes:
           - type: bind
             source: ./kibana/config/kibana.yml
             target: /usr/share/kibana/config/kibana.yml
             read_only: true
         ports:
           - "5601:5601"
         networks:
           - elk
         depends_on:
           - elasticsearch

     networks:
       elk:
         driver: bridge

     volumes:
       elasticsearch:

Install Portainer for Docker

Scenario:

Install Portainer to provide a UI for Docker containers.

Solution:

  1. Open PowerShell and run the commands below:
  2. $ docker volume create portainer_data
     $ docker run -d -p 8000:8000 -p 9000:9000 --name=portainer --restart=always -v /var/run/docker.sock:/var/run/docker.sock -v portainer_data:/data portainer/portainer

     -- Other option: connect Portainer to the daemon over TCP.
     In Settings -> General, enable "Expose daemon on tcp://localhost:2375 without TLS", then:

     netsh interface portproxy add v4tov4 listenaddress=10.0.75.1 listenport=2375 connectaddress=127.0.0.1 connectport=2375

     netsh advfirewall firewall add rule name="docker management" dir=in action=allow protocol=TCP localport=2375

     -- The first line forwards 10.0.75.1:2375 to the daemon socket on 127.0.0.1:2375; the second adds a firewall pass-through for port 2375.

     docker run -d --restart always --name portainer -p 9000:9000 -v portainer_data:/data portainer/portainer -H tcp://10.0.75.1:2375
  3. Navigate to http://localhost:9000

Docker for windows and common commands

Scenario:

Docker for windows and common commands

Solution:

  1. Get Docker: https://hub.docker.com/editions/community/docker-ce-desktop-windows/
  2. Run Docker Desktop Installer.exe.
  3. Enable the Hyper-V Windows features.
  4. Sign in to Docker Hub.
  5. Below are some common commands.
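For reference, a few widely used commands (standard Docker CLI):

  docker ps -a                       # list containers, including stopped ones
  docker images                      # list local images
  docker logs <container>            # show a container's logs
  docker exec -it <container> sh     # open a shell inside a running container
  docker stop <container>            # stop a container
  docker rm <container>              # remove a stopped container
  docker rmi <image>                 # remove an image
  docker system prune                # clean up unused data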

Move GitHub sub-repository back to the main repo

 -- delete .gitmodules
 git rm --cached MyProject/Core
 git commit -m 'Remove myproject_core submodule'
 rm -rf MyProject/Core
 git remo...