CopyPastor

Detecting plagiarism made easy.

Score: 1; Reported for: Exact paragraph match

Possible Plagiarism

Reposted on 2024-05-22
by SiddheshDesai

Original Post

Original - Posted on 2023-10-17
by SiddheshDesai



            

In order to resolve the **`ModuleNotFoundError`**, make sure you use the step below correctly in your Azure DevOps YAML pipeline:-
```yaml
- bash: |
    pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
  workingDirectory: $(workingDirectory)
  displayName: 'Install application dependencies'
```
***Make sure you select the correct branch that includes your Function code for your pipeline:-***
![enter image description here](https://i.imgur.com/gAA6Afb.png)
**Azure Repository:-**
![enter image description here](https://i.imgur.com/G8npu8j.png)
***Complete yaml pipeline code for Durable Function:-***
```yaml
trigger:
- master

variables:
  azureSubscription: '7bxxxxxxxxb50eee'
  functionAppName: 'valleyfunc8'
  vmImageName: 'ubuntu-latest'
  workingDirectory: '$(System.DefaultWorkingDirectory)/'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - bash: |
        if [ -f extensions.csproj ]
        then
          dotnet build extensions.csproj --runtime ubuntu.16.04-x64 --output ./bin
        fi
      workingDirectory: $(workingDirectory)
      displayName: 'Build extensions'
    - task: UsePythonVersion@0
      displayName: 'Use Python 3.9'
      inputs:
        versionSpec: 3.9
    - bash: |
        pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
      workingDirectory: $(workingDirectory)
      displayName: 'Install application dependencies'
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(workingDirectory)'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      artifact: drop

- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: 'development'
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureFunctionApp@1
            displayName: 'Azure functions app deploy'
            inputs:
              azureSubscription: '$(azureSubscription)'
              appType: functionAppLinux
              appName: $(functionAppName)
              package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
```
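As a rough illustration of how the `$(...)` macros in the pipeline above resolve (for example `$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip`), here is a minimal sketch. The substitution logic and the sample values are simplified stand-ins; real Azure DevOps macro expansion has more rules:

```python
import re

# Simplified sketch of Azure DevOps $(Var) macro expansion; real expansion
# has extra rules (runtime vs. compile-time, nesting). Unknown variables are
# left untouched, which matches the macro syntax's behavior.
def expand(template: str, variables: dict) -> str:
    return re.sub(r"\$\(([^)]+)\)",
                  lambda m: variables.get(m.group(1), m.group(0)),
                  template)

# Hypothetical agent values, for illustration only.
variables = {
    "Build.ArtifactStagingDirectory": "/agent/_work/1/a",
    "Build.BuildId": "42",
}
archive_file = expand("$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip", variables)
print(archive_file)  # /agent/_work/1/a/42.zip
```

This is why the Deploy stage can reconstruct the same file name with `$(Pipeline.Workspace)/drop/$(Build.BuildId).zip`: the build ID is the same in both stages of a run.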
My **requirements.txt:-**
```txt
azure-functions
azure-functions-durable
azure-identity==1.14.0
azure-keyvault-secrets==4.7.0
azure-storage-blob==12.17.0
dotmap==1.3.30
holidays==0.32
json5==0.9.14
numpy==1.25.2
pandas==2.1.0
pyodbc==4.0.39
pytz==2023.3.post1
requests_oauthlib==1.3.1
sqlalchemy==2.0.20
```
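A quick way to sanity-check a pinned requirements file like the one above is to parse it into (name, version) pairs before feeding it to `pip install -r`. A minimal stdlib-only sketch (it handles bare names and `==` pins only, not extras, markers, or URLs):

```python
def parse_requirements(text: str):
    """Split requirements.txt content into (name, version-or-None) pairs.
    Only bare names and '==' pins are handled; extras/markers/URLs are not."""
    pairs = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if "==" in line:
            name, _, version = line.partition("==")
            pairs.append((name.strip(), version.strip()))
        else:
            pairs.append((line, None))
    return pairs

reqs = "azure-functions\nnumpy==1.25.2\npandas==2.1.0\n"
print(parse_requirements(reqs))
# [('azure-functions', None), ('numpy', '1.25.2'), ('pandas', '2.1.0')]
```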
**Sample Durable Function code:-**
```python
import logging
import json

from azure.durable_functions import (
    DurableOrchestrationClient,
    DurableOrchestrationContext,
    Orchestrator,
)
import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from azure.storage.blob import BlobServiceClient
import numpy as np
import pandas as pd

# Orchestrator Function
def orchestrator_function(context: DurableOrchestrationContext):
    result1 = yield context.call_activity('HelloSecret', "Tokyo")
    result2 = yield context.call_activity('HelloBlob', "Seattle")
    result3 = yield context.call_activity('HelloData', "London")
    return [result1, result2, result3]

main = Orchestrator.create(orchestrator_function)

# Activity Function to Read Secret
def hello_secret_activity(name: str) -> str:
    logging.info(f"Reading secret for: {name}")
    # Initialize the DefaultAzureCredential
    credential = DefaultAzureCredential()
    # Access a secret from Azure Key Vault
    key_vault_url = "https://<your-key-vault-name>.vault.azure.net/"
    secret_client = SecretClient(vault_url=key_vault_url, credential=credential)
    secret_name = "<your-secret-name>"
    secret = secret_client.get_secret(secret_name)
    return f"Secret for {name}: {secret.value}"

# Activity Function to List Blobs
def hello_blob_activity(name: str) -> str:
    logging.info(f"Listing blobs for: {name}")
    # Initialize the DefaultAzureCredential
    credential = DefaultAzureCredential()
    # Access Azure Blob Storage
    blob_service_client = BlobServiceClient(
        account_url="https://<your-storage-account-name>.blob.core.windows.net/",
        credential=credential)
    container_client = blob_service_client.get_container_client("your-container-name")
    blob_names = [blob.name for blob in container_client.list_blobs()]
    return f"Blobs for {name}: {blob_names}"

# Activity Function to Perform Data Processing
def hello_data_activity(name: str) -> str:
    logging.info(f"Performing data processing for: {name}")
    # Example usage of pandas and numpy
    data = {
        'A': np.random.rand(10),
        'B': np.random.rand(10),
    }
    df = pd.DataFrame(data)
    df_summary = df.describe().to_json()
    return f"Data summary for {name}: {df_summary}"

# HTTP Trigger to Start the Orchestration
async def http_start(req: func.HttpRequest, starter: str) -> func.HttpResponse:
    client = DurableOrchestrationClient(starter)
    instance_id = await client.start_new(req.route_params["functionName"], None, None)
    logging.info(f"Started orchestration with ID = '{instance_id}'.")
    return func.HttpResponse(f"Started orchestration with ID = '{instance_id}'.",
                             status_code=202)
```
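The orchestrator above is a Python generator: the Durable runtime drives it by sending each activity's result back into the corresponding `yield`. A minimal self-contained sketch of that driving loop, where `FakeContext` and `run_orchestrator` are illustrative stand-ins for the real runtime, not its actual API:

```python
# Illustrative stand-in for DurableOrchestrationContext: the real runtime
# schedules activities and replays history; this stub just returns a result.
class FakeContext:
    def call_activity(self, activity_name, arg):
        return f"Hello {arg}!"

# Simplified copy of the orchestrator shape used above.
def orchestrator_function(context):
    result1 = yield context.call_activity('HelloSecret', "Tokyo")
    result2 = yield context.call_activity('HelloBlob', "Seattle")
    return [result1, result2]

def run_orchestrator(fn, context):
    """Drive a generator orchestrator: feed each yielded activity result
    back in via send(), and capture the final return value."""
    gen = fn(context)
    try:
        value = next(gen)            # run to the first yield
        while True:
            value = gen.send(value)  # hand the activity result back
    except StopIteration as stop:
        return stop.value

print(run_orchestrator(orchestrator_function, FakeContext()))
# ['Hello Tokyo!', 'Hello Seattle!']
```

This is why orchestrator code must be deterministic: on replay, the runtime re-runs the generator and substitutes recorded results at each `yield`.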
**Sample Http Trigger code:-**
```python
import logging

import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from azure.storage.blob import BlobServiceClient
from dotmap import DotMap
import holidays
import json5
import numpy as np
import pandas as pd
import pyodbc
import pytz
from requests_oauthlib import OAuth2Session
from sqlalchemy import create_engine

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    # Initialize the DefaultAzureCredential
    credential = DefaultAzureCredential()

    # Access a secret from Azure Key Vault
    key_vault_url = "https://siliconkeyvault9.vault.azure.net/"
    secret_client = SecretClient(vault_url=key_vault_url, credential=credential)
    secret_name = "secret3"
    secret = secret_client.get_secret(secret_name)
    logging.info(f"Secret: {secret.value}")

    # Access Azure Blob Storage
    blob_service_client = BlobServiceClient(
        account_url="https://siliconstrg8.blob.core.windows.net/",
        credential=credential)
    container_client = blob_service_client.get_container_client("your-container-name")
    blob_names = [blob.name for blob in container_client.list_blobs()]
    logging.info(f"Blobs in container: {blob_names}")

    # Example usage of pandas and numpy
    data = {
        'A': np.random.rand(10),
        'B': np.random.rand(10),
    }
    df = pd.DataFrame(data)
    df_summary = df.describe().to_json()

    # Example usage of holidays
    us_holidays = holidays.US(years=2024)
    holidays_list = [str(date) for date in us_holidays]

    response = {
        "secret_value": secret.value,
        "blobs": blob_names,
        "data_summary": json5.loads(df_summary),
        "holidays": holidays_list,
    }
    return func.HttpResponse(json5.dumps(response), mimetype="application/json")
```
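The response assembly in the trigger above boils down to: compute a per-column summary, then serialize everything to JSON. A stdlib-only sketch of the same shape, using `statistics` instead of pandas and `json` instead of `json5` (both swaps are mine, so the example is self-contained; the sample data and holiday strings are also illustrative):

```python
import json
import statistics

# Stdlib stand-ins for the pandas describe() + json5 serialization used above.
data = {'A': [1.0, 2.0, 3.0], 'B': [4.0, 5.0, 6.0]}
data_summary = {
    col: {"count": len(vals), "mean": statistics.mean(vals),
          "min": min(vals), "max": max(vals)}
    for col, vals in data.items()
}
response = {
    "data_summary": data_summary,
    "holidays": ["2024-01-01", "2024-07-04"],  # illustrative values
}
body = json.dumps(response)
print(json.loads(body)["data_summary"]["A"]["mean"])  # 2.0
```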
***Sample yaml code for Http Trigger code:-***
```yaml
trigger:
- main

variables:
  azureSubscription: '4d3e6xxxxxxf22707ba'
  functionAppName: 'siliconfuncapp'
  vmImageName: 'ubuntu-latest'
  workingDirectory: '$(System.DefaultWorkingDirectory)/'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - bash: |
        if [ -f extensions.csproj ]
        then
          dotnet build extensions.csproj --runtime ubuntu.16.04-x64 --output ./bin
        fi
      workingDirectory: $(workingDirectory)
      displayName: 'Build extensions'
    - task: UsePythonVersion@0
      displayName: 'Use Python 3.9'
      inputs:
        versionSpec: 3.9
    - bash: |
        pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
      workingDirectory: $(workingDirectory)
      displayName: 'Install application dependencies'
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(workingDirectory)'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      artifact: drop

- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: 'development'
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureFunctionApp@1
            displayName: 'Azure functions app deploy'
            inputs:
              azureSubscription: '$(azureSubscription)'
              appType: functionAppLinux
              appName: $(functionAppName)
              package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
```
**Output:-**
**For Durable Function:-**
**`holidays`** package is installed correctly:-
![enter image description here](https://i.imgur.com/HkSRPms.png)
![enter image description here](https://i.imgur.com/vFnh5GJ.png)
**For Http Trigger:-**
**`holidays`** package is installed correctly:-
![enter image description here](https://i.imgur.com/UaoDeRG.png)
![enter image description here](https://i.imgur.com/O41NHpu.png)
> Is this the correct way to accomplish this? The Python Functions documentation seems to suggest that the requirements.txt should be processed during the build, but it absolutely is not.
Yes. The requirements.txt packages need to be installed on the DevOps agent, and the build artifact containing those packages is then deployed to the Function App for the function trigger to work. Even the default Azure DevOps YAML template for deploying a Python Function has a separate pip install step to install the packages. Refer below:-
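The reason the `pip install --target` step works is that, at startup, the Python Functions host adds `.python_packages/lib/site-packages` from the deployed zip to `sys.path`, so the bundled packages become importable. A minimal local simulation of that lookup (the directory layout matches what the pipeline produces; the dummy module `fake_dep` is mine, for illustration):

```python
import pathlib
import sys
import tempfile

# Simulate the Functions host putting .python_packages/lib/site-packages
# (as produced by the pipeline's `pip install --target` step) on sys.path.
with tempfile.TemporaryDirectory() as root:
    site = pathlib.Path(root) / ".python_packages" / "lib" / "site-packages"
    site.mkdir(parents=True)
    # Stand-in for a real dependency installed by pip --target.
    (site / "fake_dep.py").write_text("VALUE = 'installed'\n")

    sys.path.insert(0, str(site))   # what the host does for you at startup
    import fake_dep
    print(fake_dep.VALUE)  # installed
```

If this folder is missing from the artifact (for example, because the pip step ran in the wrong `workingDirectory`), the import fails at runtime with the `ModuleNotFoundError` described above.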
I have used the template below, selected my Function App, and set the path to **`$(System.DefaultWorkingDirectory)`**.
**My Azure DevOps yaml script:-**
![enter image description here](https://i.imgur.com/Yx6QVBe.png)
```yaml
trigger:
- master

variables:
  azureSubscription: 'xxxxxxxxxx354dd'
  # Function app name
  functionAppName: 'siliconfunc76'
  # Agent VM image name
  vmImageName: 'ubuntu-latest'
  # Working Directory
  workingDirectory: '$(System.DefaultWorkingDirectory)'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - bash: |
        if [ -f extensions.csproj ]
        then
          dotnet build extensions.csproj --runtime ubuntu.16.04-x64 --output ./bin
        fi
      workingDirectory: $(workingDirectory)
      displayName: 'Build extensions'
    - task: UsePythonVersion@0
      displayName: 'Use Python 3.10'
      inputs:
        versionSpec: 3.10 # Functions V2 supports Python 3.6 as of today
    - bash: |
        pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
      workingDirectory: $(workingDirectory)
      displayName: 'Install application dependencies'
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(workingDirectory)'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      artifact: drop

- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: 'development'
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureFunctionApp@1
            displayName: 'Azure functions app deploy'
            inputs:
              azureSubscription: '$(azureSubscription)'
              appType: functionAppLinux
              appName: $(functionAppName)
              package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
```
**Output:-**
All the package dependencies are installed in the Build step, where the Function is built on the Azure DevOps agent, like below:-
![enter image description here](https://i.imgur.com/ssjMYI2.png)
Those packages are then archived into a zip file like below:-
![enter image description here](https://i.imgur.com/0smubho.png)
The archived zip is then published as an artifact like below:-
![enter image description here](https://i.imgur.com/TkYmmVw.png)
In the next stage, the published artifact is downloaded and deployed to the Function App like below:-
![enter image description here](https://i.imgur.com/ORWdYsf.png)
![enter image description here](https://i.imgur.com/SBpWYho.png)
**The above method is currently the ideal and most efficient way to deploy the Function App.**
- You can also build the Function app in the Build stage only and deploy it via a Release pipeline; refer to my [SO thread answer1](https://stackoverflow.com/questions/75877443/devops-release-pipeline-zip-file-into-azure-function-app-deployment/75884713#75884713) for the same.
- Alternatively, you can refer to my YAML script in this [SO thread answer2](https://stackoverflow.com/questions/77148969/azure-function-publish-reduce-scm-environment-setting-timout-duration/77215510#77215510), where I utilized **`func azure functionapp publish functionappname --python`** and installed the dependencies with **`pip install`** in the same build step.
- To deploy a Python v2 programming-model Function using Azure DevOps, refer to my [SO thread answer3](https://stackoverflow.com/questions/76559759/azure-functions-python-programming-model-v2-1-found-0-loaded/76562032#76562032).

        