Repository: robbrad/UKBinCollectionData Branch: master Commit: 60bd3ccee981 Files: 442 Total size: 3.7 MB Directory structure: gitextract_9uhgfajq/ ├── .devcontainer/ │ ├── dev.Dockerfile │ ├── devcontainer.json │ └── docker-compose.yml ├── .dockerignore ├── .github/ │ ├── ISSUE_TEMPLATE/ │ │ ├── COUNCIL_ISSUE.yaml │ │ ├── COUNCIL_REQUEST.yaml │ │ └── HOME_ASSISTANT_CUSTOM_COMPONENT_ISSUE.yaml │ ├── dependabot.yaml │ └── workflows/ │ ├── behave_pull_request.yml │ ├── behave_schedule.yml │ ├── bump.yml │ ├── codeql-analysis.yml │ ├── docker-image.yml │ ├── ha_compatibility_test.yml │ ├── hacs_validation.yml │ ├── lint.yml │ ├── release.yml │ ├── rollback-release.yml │ ├── validate-release-ready.yml │ └── wiki.yml ├── .gitignore ├── CHANGELOG.md ├── CODE_OF_CONDUCT.md ├── COMPATIBILITY.md ├── CONTRIBUTING.md ├── LICENSE ├── Makefile ├── README.md ├── behave.ini ├── bin_to_ics.py ├── conftest.py ├── custom_components/ │ ├── __init__.py │ └── uk_bin_collection/ │ ├── README.md │ ├── __init__.py │ ├── calendar.py │ ├── config_flow.py │ ├── const.py │ ├── manifest.json │ ├── sensor.py │ ├── services.yaml │ ├── strings.json │ ├── tests/ │ │ ├── __init__.py │ │ ├── common_utils.py │ │ ├── test_calendar.py │ │ ├── test_config_flow.py │ │ ├── test_init.py │ │ └── test_sensor.py │ └── translations/ │ ├── cy.json │ ├── en.json │ ├── ga.json │ ├── gd.json │ └── pt.json ├── docs/ │ ├── RELEASE-SETUP-SUMMARY.md │ ├── deploy-key-setup.md │ ├── example_council.md │ ├── github-app-setup.md │ ├── github-app-troubleshooting.md │ ├── manual-tag-fix.md │ ├── release-quick-reference.md │ ├── release-workflow-branch-protection.md │ ├── release-workflow-diagram.md │ ├── release-workflow-fixes.md │ ├── release-workflow-migration.md │ ├── release-workflow-setup-checklist.md │ ├── release-workflow.md │ ├── rollback-release.md │ ├── utilities.md │ ├── workflow-improvements-summary.md │ └── workflow-naming-conventions.md ├── hacs.json ├── poetry.lock ├── pyproject.toml ├── pytest.ini ├── 
scripts/ │ └── check_ha_compatibility.py ├── uk_bin_collection/ │ ├── Local_Authority_Boundaries.geojson │ ├── README.rst │ ├── compare_lad_codes.py │ ├── map.html │ ├── tests/ │ │ ├── check_selenium_url_in_input.json.py │ │ ├── council_feature_input_parity.py │ │ ├── features/ │ │ │ ├── environment.py │ │ │ └── validate_council_outputs.feature │ │ ├── generate_map_test_results.py │ │ ├── input.json │ │ ├── output.schema │ │ ├── step_defs/ │ │ │ ├── step_helpers/ │ │ │ │ └── file_handler.py │ │ │ └── test_validate_council.py │ │ ├── test_collect_data.py │ │ ├── test_common_functions.py │ │ ├── test_conftest.py │ │ └── test_get_data.py │ └── uk_bin_collection/ │ ├── collect_data.py │ ├── common.py │ ├── councils/ │ │ ├── AberdeenCityCouncil.py │ │ ├── AberdeenshireCouncil.py │ │ ├── AdurAndWorthingCouncils.py │ │ ├── AmberValleyBoroughCouncil.py │ │ ├── AngusCouncil.py │ │ ├── AntrimAndNewtonabbeyCouncil.py │ │ ├── ArdsAndNorthDownCouncil.py │ │ ├── ArgyllandButeCouncil.py │ │ ├── ArmaghBanbridgeCraigavonCouncil.py │ │ ├── ArunCouncil.py │ │ ├── AshfieldDistrictCouncil.py │ │ ├── AshfordBoroughCouncil.py │ │ ├── BCPCouncil.py │ │ ├── BaberghDistrictCouncil.py │ │ ├── BarkingDagenham.py │ │ ├── BarnetCouncil.py │ │ ├── BarnsleyMBCouncil.py │ │ ├── BasildonCouncil.py │ │ ├── BasingstokeCouncil.py │ │ ├── BathAndNorthEastSomersetCouncil.py │ │ ├── BedfordBoroughCouncil.py │ │ ├── BedfordshireCouncil.py │ │ ├── BelfastCityCouncil.py │ │ ├── BexleyCouncil.py │ │ ├── BirminghamCityCouncil.py │ │ ├── BlabyDistrictCouncil.py │ │ ├── BlackburnCouncil.py │ │ ├── BlackpoolCouncil.py │ │ ├── BlaenauGwentCountyBoroughCouncil.py │ │ ├── BolsoverCouncil.py │ │ ├── BoltonCouncil.py │ │ ├── BostonBoroughCouncil.py │ │ ├── BracknellForestCouncil.py │ │ ├── BradfordMDC.py │ │ ├── BraintreeDistrictCouncil.py │ │ ├── BrecklandCouncil.py │ │ ├── BrentCouncil.py │ │ ├── BrightonandHoveCityCouncil.py │ │ ├── BristolCityCouncil.py │ │ ├── BroadlandDistrictCouncil.py │ │ ├── 
BromleyBoroughCouncil.py │ │ ├── BromsgroveDistrictCouncil.py │ │ ├── BroxbourneCouncil.py │ │ ├── BroxtoweBoroughCouncil.py │ │ ├── BuckinghamshireCouncil.py │ │ ├── BurnleyBoroughCouncil.py │ │ ├── BuryCouncil.py │ │ ├── CalderdaleCouncil.py │ │ ├── CambridgeCityCouncil.py │ │ ├── CannockChaseDistrictCouncil.py │ │ ├── CanterburyCityCouncil.py │ │ ├── CardiffCouncil.py │ │ ├── CarmarthenshireCountyCouncil.py │ │ ├── CastlepointDistrictCouncil.py │ │ ├── CeredigionCountyCouncil.py │ │ ├── CharnwoodBoroughCouncil.py │ │ ├── ChelmsfordCityCouncil.py │ │ ├── CheltenhamBoroughCouncil.py │ │ ├── CherwellDistrictCouncil.py │ │ ├── CheshireEastCouncil.py │ │ ├── CheshireWestAndChesterCouncil.py │ │ ├── ChesterfieldBoroughCouncil.py │ │ ├── ChichesterDistrictCouncil.py │ │ ├── ChorleyCouncil.py │ │ ├── ColchesterCityCouncil.py │ │ ├── ConwyCountyBorough.py │ │ ├── CornwallCouncil.py │ │ ├── CotswoldDistrictCouncil.py │ │ ├── CoventryCityCouncil.py │ │ ├── CrawleyBoroughCouncil.py │ │ ├── CroydonCouncil.py │ │ ├── CumberlandCouncil.py │ │ ├── DacorumBoroughCouncil.py │ │ ├── DarlingtonBoroughCouncil.py │ │ ├── DartfordBoroughCouncil.py │ │ ├── DenbighshireCouncil.py │ │ ├── DerbyCityCouncil.py │ │ ├── DerbyshireDalesDistrictCouncil.py │ │ ├── DoncasterCouncil.py │ │ ├── DorsetCouncil.py │ │ ├── DoverDistrictCouncil.py │ │ ├── DudleyCouncil.py │ │ ├── DumfriesandGallowayCouncil.py │ │ ├── DundeeCityCouncil.py │ │ ├── DurhamCouncil.py │ │ ├── EalingCouncil.py │ │ ├── EastAyrshireCouncil.py │ │ ├── EastCambridgeshireCouncil.py │ │ ├── EastDevonDC.py │ │ ├── EastDunbartonshireCouncil.py │ │ ├── EastHertsCouncil.py │ │ ├── EastLindseyDistrictCouncil.py │ │ ├── EastLothianCouncil.py │ │ ├── EastRenfrewshireCouncil.py │ │ ├── EastRidingCouncil.py │ │ ├── EastStaffordshireBoroughCouncil.py │ │ ├── EastSuffolkCouncil.py │ │ ├── EastbourneBoroughCouncil.py │ │ ├── EastleighBoroughCouncil.py │ │ ├── EdenDistrictCouncil.py │ │ ├── EdinburghCityCouncil.py │ │ ├── 
ElmbridgeBoroughCouncil.py │ │ ├── EnfieldCouncil.py │ │ ├── EnvironmentFirst.py │ │ ├── EppingForestDistrictCouncil.py │ │ ├── EpsomandEwellBoroughCouncil.py │ │ ├── ErewashBoroughCouncil.py │ │ ├── ExeterCityCouncil.py │ │ ├── FalkirkCouncil.py │ │ ├── FarehamBoroughCouncil.py │ │ ├── FenlandDistrictCouncil.py │ │ ├── FermanaghOmaghDistrictCouncil.py │ │ ├── FifeCouncil.py │ │ ├── FlintshireCountyCouncil.py │ │ ├── FolkestoneandHytheDistrictCouncil.py │ │ ├── ForestOfDeanDistrictCouncil.py │ │ ├── FyldeCouncil.py │ │ ├── GatesheadCouncil.py │ │ ├── GedlingBoroughCouncil.py │ │ ├── GlasgowCityCouncil.py │ │ ├── GloucesterCityCouncil.py │ │ ├── GooglePublicCalendarCouncil.py │ │ ├── GosportBoroughCouncil.py │ │ ├── GraveshamBoroughCouncil.py │ │ ├── GreatYarmouthBoroughCouncil.py │ │ ├── GuildfordCouncil.py │ │ ├── GwyneddCouncil.py │ │ ├── HackneyCouncil.py │ │ ├── HaltonBoroughCouncil.py │ │ ├── HarboroughDistrictCouncil.py │ │ ├── HaringeyCouncil.py │ │ ├── HarlowCouncil.py │ │ ├── HarrogateBoroughCouncil.py │ │ ├── HartDistrictCouncil.py │ │ ├── HartlepoolBoroughCouncil.py │ │ ├── HastingsBoroughCouncil.py │ │ ├── HerefordshireCouncil.py │ │ ├── HertsmereBoroughCouncil.py │ │ ├── HighPeakCouncil.py │ │ ├── HighlandCouncil.py │ │ ├── Hillingdon.py │ │ ├── HinckleyandBosworthBoroughCouncil.py │ │ ├── HorshamDistrictCouncil.py │ │ ├── HullCityCouncil.py │ │ ├── HuntingdonDistrictCouncil.py │ │ ├── HyndburnBoroughCouncil.py │ │ ├── IpswichBoroughCouncil.py │ │ ├── IsleOfAngleseyCouncil.py │ │ ├── IslingtonCouncil.py │ │ ├── KingsLynnandWestNorfolkBC.py │ │ ├── KingstonUponThamesCouncil.py │ │ ├── KirkleesCouncil.py │ │ ├── KnowsleyMBCouncil.py │ │ ├── LancasterCityCouncil.py │ │ ├── LeedsCityCouncil.py │ │ ├── LeicesterCityCouncil.py │ │ ├── LewesDistrictCouncil.py │ │ ├── LichfieldDistrictCouncil.py │ │ ├── LincolnCouncil.py │ │ ├── LisburnCastlereaghCityCouncil.py │ │ ├── LiverpoolCityCouncil.py │ │ ├── LondonBoroughCamdenCouncil.py │ │ ├── LondonBoroughEaling.py 
│ │ ├── LondonBoroughHammersmithandFulham.py │ │ ├── LondonBoroughHarrow.py │ │ ├── LondonBoroughHavering.py │ │ ├── LondonBoroughHounslow.py │ │ ├── LondonBoroughLambeth.py │ │ ├── LondonBoroughLewisham.py │ │ ├── LondonBoroughOfRichmondUponThames.py │ │ ├── LondonBoroughRedbridge.py │ │ ├── LondonBoroughSutton.py │ │ ├── LutonBoroughCouncil.py │ │ ├── MaidstoneBoroughCouncil.py │ │ ├── MaldonDistrictCouncil.py │ │ ├── MalvernHillsDC.py │ │ ├── ManchesterCityCouncil.py │ │ ├── MansfieldDistrictCouncil.py │ │ ├── MedwayCouncil.py │ │ ├── MeltonBoroughCouncil.py │ │ ├── MertonCouncil.py │ │ ├── MidAndEastAntrimBoroughCouncil.py │ │ ├── MidDevonCouncil.py │ │ ├── MidSuffolkDistrictCouncil.py │ │ ├── MidSussexDistrictCouncil.py │ │ ├── MidUlsterDistrictCouncil.py │ │ ├── MiddlesbroughCouncil.py │ │ ├── MidlothianCouncil.py │ │ ├── MiltonKeynesCityCouncil.py │ │ ├── MoleValleyDistrictCouncil.py │ │ ├── MonmouthshireCountyCouncil.py │ │ ├── MorayCouncil.py │ │ ├── NeathPortTalbotCouncil.py │ │ ├── NewForestCouncil.py │ │ ├── NewarkAndSherwoodDC.py │ │ ├── NewcastleCityCouncil.py │ │ ├── NewcastleUnderLymeCouncil.py │ │ ├── NewhamCouncil.py │ │ ├── NewportCityCouncil.py │ │ ├── NorthAyrshireCouncil.py │ │ ├── NorthDevonCountyCouncil.py │ │ ├── NorthEastDerbyshireDistrictCouncil.py │ │ ├── NorthEastLincs.py │ │ ├── NorthHertfordshireDistrictCouncil.py │ │ ├── NorthKestevenDistrictCouncil.py │ │ ├── NorthLanarkshireCouncil.py │ │ ├── NorthLincolnshireCouncil.py │ │ ├── NorthNorfolkDistrictCouncil.py │ │ ├── NorthNorthamptonshireCouncil.py │ │ ├── NorthSomersetCouncil.py │ │ ├── NorthTynesideCouncil.py │ │ ├── NorthWarwickshireBoroughCouncil.py │ │ ├── NorthWestLeicestershire.py │ │ ├── NorthYorkshire.py │ │ ├── NorthumberlandCouncil.py │ │ ├── NorwichCityCouncil.py │ │ ├── NottinghamCityCouncil.py │ │ ├── NuneatonBedworthBoroughCouncil.py │ │ ├── OadbyAndWigstonBoroughCouncil.py │ │ ├── OldhamCouncil.py │ │ ├── OxfordCityCouncil.py │ │ ├── PembrokeshireCountyCouncil.py │ │ 
├── PerthAndKinrossCouncil.py │ │ ├── PeterboroughCityCouncil.py │ │ ├── PlymouthCouncil.py │ │ ├── PortsmouthCityCouncil.py │ │ ├── PowysCouncil.py │ │ ├── PrestonCityCouncil.py │ │ ├── ReadingBoroughCouncil.py │ │ ├── RedcarandClevelandCouncil.py │ │ ├── RedditchBoroughCouncil.py │ │ ├── ReigateAndBansteadBoroughCouncil.py │ │ ├── RenfrewshireCouncil.py │ │ ├── RhonddaCynonTaffCouncil.py │ │ ├── RochdaleCouncil.py │ │ ├── RochfordCouncil.py │ │ ├── RotherDistrictCouncil.py │ │ ├── RotherhamCouncil.py │ │ ├── RoyalBoroughofGreenwich.py │ │ ├── RugbyBoroughCouncil.py │ │ ├── RunnymedeBoroughCouncil.py │ │ ├── RushcliffeBoroughCouncil.py │ │ ├── RushmoorCouncil.py │ │ ├── SalfordCityCouncil.py │ │ ├── SandwellBoroughCouncil.py │ │ ├── SeftonCouncil.py │ │ ├── SevenoaksDistrictCouncil.py │ │ ├── SheffieldCityCouncil.py │ │ ├── ShropshireCouncil.py │ │ ├── SloughBoroughCouncil.py │ │ ├── SolihullCouncil.py │ │ ├── SomersetCouncil.py │ │ ├── SouthAyrshireCouncil.py │ │ ├── SouthCambridgeshireCouncil.py │ │ ├── SouthDerbyshireDistrictCouncil.py │ │ ├── SouthGloucestershireCouncil.py │ │ ├── SouthHamsDistrictCouncil.py │ │ ├── SouthHollandDistrictCouncil.py │ │ ├── SouthKestevenDistrictCouncil.py │ │ ├── SouthLanarkshireCouncil.py │ │ ├── SouthNorfolkCouncil.py │ │ ├── SouthOxfordshireCouncil.py │ │ ├── SouthRibbleCouncil.py │ │ ├── SouthStaffordshireDistrictCouncil.py │ │ ├── SouthTynesideCouncil.py │ │ ├── SouthamptonCityCouncil.py │ │ ├── SouthwarkCouncil.py │ │ ├── SpelthorneBoroughCouncil.py │ │ ├── StAlbansCityAndDistrictCouncil.py │ │ ├── StHelensBC.py │ │ ├── StaffordBoroughCouncil.py │ │ ├── StaffordshireMoorlandsDistrictCouncil.py │ │ ├── StevenageBoroughCouncil.py │ │ ├── StirlingCouncil.py │ │ ├── StockportBoroughCouncil.py │ │ ├── StocktonOnTeesCouncil.py │ │ ├── StokeOnTrentCityCouncil.py │ │ ├── StratfordUponAvonCouncil.py │ │ ├── StroudDistrictCouncil.py │ │ ├── SunderlandCityCouncil.py │ │ ├── SurreyHeathBoroughCouncil.py │ │ ├── SwaleBoroughCouncil.py │ 
│ ├── SwanseaCouncil.py │ │ ├── SwindonBoroughCouncil.py │ │ ├── TamesideMBCouncil.py │ │ ├── TandridgeDistrictCouncil.py │ │ ├── TeignbridgeCouncil.py │ │ ├── TelfordAndWrekinCouncil.py │ │ ├── TendringDistrictCouncil.py │ │ ├── TestValleyBoroughCouncil.py │ │ ├── TewkesburyBoroughCouncil.py │ │ ├── ThanetDistrictCouncil.py │ │ ├── ThreeRiversDistrictCouncil.py │ │ ├── ThurrockCouncil.py │ │ ├── TonbridgeAndMallingBC.py │ │ ├── TorbayCouncil.py │ │ ├── TorridgeDistrictCouncil.py │ │ ├── TunbridgeWellsCouncil.py │ │ ├── UttlesfordDistrictCouncil.py │ │ ├── ValeofGlamorganCouncil.py │ │ ├── ValeofWhiteHorseCouncil.py │ │ ├── WakefieldCityCouncil.py │ │ ├── WalsallCouncil.py │ │ ├── WalthamForest.py │ │ ├── WandsworthCouncil.py │ │ ├── WarringtonBoroughCouncil.py │ │ ├── WarwickDistrictCouncil.py │ │ ├── WatfordBoroughCouncil.py │ │ ├── WaverleyBoroughCouncil.py │ │ ├── WealdenDistrictCouncil.py │ │ ├── WelhatCouncil.py │ │ ├── WestBerkshireCouncil.py │ │ ├── WestDunbartonshireCouncil.py │ │ ├── WestLancashireBoroughCouncil.py │ │ ├── WestLindseyDistrictCouncil.py │ │ ├── WestLothianCouncil.py │ │ ├── WestMorlandAndFurness.py │ │ ├── WestNorthamptonshireCouncil.py │ │ ├── WestOxfordshireDistrictCouncil.py │ │ ├── WestSuffolkCouncil.py │ │ ├── WiganBoroughCouncil.py │ │ ├── WiltshireCouncil.py │ │ ├── WinchesterCityCouncil.py │ │ ├── WindsorAndMaidenheadCouncil.py │ │ ├── WirralCouncil.py │ │ ├── WokingBoroughCouncil.py │ │ ├── WokinghamBoroughCouncil.py │ │ ├── WolverhamptonCityCouncil.py │ │ ├── WorcesterCityCouncil.py │ │ ├── WrexhamCountyBoroughCouncil.py │ │ ├── WychavonDistrictCouncil.py │ │ ├── WyreCouncil.py │ │ ├── WyreForestDistrictCouncil.py │ │ ├── YorkCouncil.py │ │ ├── council_class_template/ │ │ │ └── councilclasstemplate.py │ │ └── tests/ │ │ ├── conftest.py │ │ ├── test_south_kesteven_district_council.py │ │ └── test_south_kesteven_integration.py │ ├── create_new_council.py │ └── get_bin_data.py ├── uk_bin_collection_api_server/ │ ├── Dockerfile │ ├── 
docker-compose.yml │ ├── requirements.txt │ ├── server.py │ └── swagger.yaml └── wiki/ ├── Councils.md ├── Home.md ├── Setup.md └── generate_wiki.py
================================================
FILE CONTENTS
================================================

================================================
FILE: .devcontainer/dev.Dockerfile
================================================
ARG VARIANT="3.12-bullseye"
FROM mcr.microsoft.com/devcontainers/python:${VARIANT} AS ukbc-dev-base

USER root

# Install dependencies for Google Chrome
RUN dpkg --add-architecture amd64 && \
    apt-get update && \
    apt-get install -y --no-install-recommends \
        wget \
        gnupg2 \
        software-properties-common \
        apt-transport-https \
        ca-certificates \
        unzip \
        libasound2 \
        libatk-bridge2.0-0 \
        libatk1.0-0 \
        libatspi2.0-0 \
        libcairo2 \
        libcups2 \
        libdbus-1-3 \
        libexpat1 \
        libgbm1 \
        libglib2.0-0 \
        libgtk-3-0 \
        libnspr4 \
        libnss3 \
        libpango-1.0-0 \
        libudev1 \
        libvulkan1 \
        libx11-6 \
        libxcb1 \
        libxcomposite1 \
        libxdamage1 \
        libxext6 \
        libxfixes3 \
        libxkbcommon0 \
        libxrandr2 \
        libcurl4 && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/*

# Add Google Chrome repository
RUN wget -q -O - https://dl.google.com/linux/linux_signing_key.pub | apt-key add - && \
    echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" > /etc/apt/sources.list.d/google-chrome.list && \
    apt-get update

# Install Chrome
RUN apt-get install -y google-chrome-stable && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/*

# Install ChromeDriver (version matched to the installed Chrome)
RUN CHROME_VERSION=$(google-chrome --version | sed 's/Google Chrome //' | tr -d ' ') && \
    wget -O /tmp/chromedriver.zip "https://storage.googleapis.com/chrome-for-testing-public/${CHROME_VERSION}/linux64/chromedriver-linux64.zip" && \
    unzip /tmp/chromedriver.zip -d /tmp && \
    mv /tmp/chromedriver-linux64/chromedriver /usr/local/bin/ && \
    rm -rf /tmp/chromedriver* && \
    chmod +x /usr/local/bin/chromedriver

USER vscode

# Define the version of Poetry to install (default is 1.8.4)
# Define the directory of the Python virtual environment
ARG PYTHON_VIRTUALENV_HOME=/home/vscode/ukbc-py-env \
    POETRY_VERSION=1.8.4

ENV POETRY_VIRTUALENVS_IN_PROJECT=false \
    POETRY_NO_INTERACTION=true

# Install Poetry outside of the virtual environment to avoid conflicts
RUN python3 -m pip install --user pipx && \
    python3 -m pipx ensurepath && \
    pipx install poetry==${POETRY_VERSION}

# Create a Python virtual environment for the project
RUN python3 -m venv ${PYTHON_VIRTUALENV_HOME} && \
    $PYTHON_VIRTUALENV_HOME/bin/pip install --upgrade pip

ENV PATH="$PYTHON_VIRTUALENV_HOME/bin:$PATH" \
    VIRTUAL_ENV=$PYTHON_VIRTUALENV_HOME

# Setup for bash
RUN poetry completions bash >> /home/vscode/.bash_completion && \
    echo "export PATH=$PYTHON_VIRTUALENV_HOME/bin:$PATH" >> ~/.bashrc

# Set the working directory for the app
WORKDIR /ukbc_build

# Use a multi-stage build to install dependencies
FROM ukbc-dev-base AS ukbc-dev-dependencies
ARG PYTHON_VIRTUALENV_HOME
COPY . /ukbc_build/
RUN poetry install --no-interaction --no-ansi --with dev

# docker build -f .devcontainer/dev.Dockerfile -t ukbc_dev_container .
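The ChromeDriver step above derives the download URL from the version string reported by `google-chrome --version`. As a minimal Python sketch of that same URL construction (the `chromedriver_url` helper is illustrative, not part of the repo):

```python
def chromedriver_url(chrome_version: str) -> str:
    """Build the Chrome-for-Testing ChromeDriver download URL.

    `chrome_version` is the full version string reported by
    `google-chrome --version`, e.g. "126.0.6478.126".
    Mirrors the URL pattern used in dev.Dockerfile.
    """
    return (
        "https://storage.googleapis.com/chrome-for-testing-public/"
        f"{chrome_version}/linux64/chromedriver-linux64.zip"
    )


# Example: chromedriver_url("126.0.6478.126") yields the linux64 zip URL
# for that exact Chrome release, keeping browser and driver in lockstep.
```

Pinning the driver to the exact browser version this way avoids the classic "session not created: this version of ChromeDriver only supports ..." failure when the two drift apart.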
================================================
FILE: .devcontainer/devcontainer.json
================================================
{
    "dockerComposeFile": "docker-compose.yml",
    "service": "devcontainer",
    "workspaceFolder": "/workspaces/UKBinCollectionData",
    "customizations": {
        "vscode": {
            "extensions": [
                "alexkrechik.cucumberautocomplete",
                "eamodio.gitlens",
                "Gruntfuggly.todo-tree",
                "ms-python.black-formatter",
                "ms-python.isort",
                "ms-python.pylint", // Add pylint extension
                "ms-python.python",
                "ms-python.vscode-pep8", // Add pep8 extension
                "ms-python.vscode-pylance",
                "oderwat.indent-rainbow",
                "ryanluker.vscode-coverage-gutters",
                "yzhang.markdown-all-in-one"
            ],
            "settings": {
                "makefile.makefilePath": "${workspaceFolder}/",
                "files.exclude": {
                    "**/__pycache__": true,
                    "**/.pytest_cache": true
                },
                "autoSave": true,
                "git.autorefresh": true,
                "extensions.ignoreRecommendations": true,
                "isort.args": ["--profile", "black"],
                "python.analysis.diagnosticMode": "workspace",
                "python.analysis.typeCheckingMode": "strict",
                "python.analysis.logLevel": "Trace",
                "python.analysis.disableSemanticOnNoPython": false,
                "python.analysis.enableSyncServer": true,
                "python.analysis.userFileIndexingLimit": -1,
                "python.formatting.provider": "none",
                "python.languageServer": "Pylance",
                "python.linting.enabled": true,
                "python.linting.pylintEnabled": true,
                "python.linting.pep8Enabled": true,
                "python.linting.lintOnSave": true,
                "python.testing.autoTestDiscoverOnSaveEnabled": false,
                "python.defaultInterpreterPath": "/home/vscode/ukbc-py-env",
                "python.testing.pytestArgs": [
                    "${workspaceFolder}/uk_bin_collection",
                    "${workspaceFolder}/custom_components/uk_bin_collection/tests",
                    "--headless=False",
                    "-o cache_dir=${workspaceFolder}/.pytest_cache"
                ],
                "python.testing.unittestEnabled": false,
                "python.testing.pytestEnabled": true,
                "[python]": {
                    "editor.defaultFormatter": "ms-python.black-formatter",
                    "editor.formatOnSave": true,
                    "editor.formatOnPaste": false,
                    "editor.formatOnSaveMode": "file",
                    "editor.codeActionsOnSave": {
                        "source.organizeImports": true
                    }
                },
                "workbench.colorCustomizations": {
                    "editorError.foreground": "#ff000088",
                    "editorWarning.foreground": "#ffe60033",
                    "editorInfo.foreground": "#00ff0088"
                }
            }
        }
    }
}

================================================
FILE: .devcontainer/docker-compose.yml
================================================
services:
  devcontainer:
    image: ukbc_dev_container  # This tags the built image
    build:
      context: ../  # Path to the directory containing the Dockerfile
      dockerfile: .devcontainer/dev.Dockerfile
    volumes:
      - ../:/workspaces/UKBinCollectionData:rw
    privileged: true
    hostname: devcontainer
    network_mode: host
    depends_on:
      - selenium-hub
    command: sleep infinity

  homeassistant:
    container_name: homeassistant
    image: "ghcr.io/home-assistant/home-assistant:stable"
    volumes:
      - .ha_config:/config:rw
      - ../custom_components:/config/custom_components
      - /etc/localtime:/etc/localtime:ro
      - /run/dbus:/run/dbus:ro
    restart: unless-stopped
    privileged: true
    networks:
      - devnet
    ports:
      - "8124:8123/tcp"

  chrome1:
    image: selenium/node-chrome:4.20.0-20240505
    shm_size: 2gb
    networks:
      - devnet
    depends_on:
      - selenium-hub
    ports:
      - "7901:7900"
      - "5551:5555"
    environment:
      - SE_EVENT_BUS_HOST=selenium-hub
      - SE_EVENT_BUS_PUBLISH_PORT=4442
      - SE_EVENT_BUS_SUBSCRIBE_PORT=4443
      - VNC_NO_PASSWORD=1
    privileged: true
    restart: always

  chrome2:
    image: selenium/node-chrome:4.20.0-20240505
    shm_size: 2gb
    networks:
      - devnet
    depends_on:
      - selenium-hub
    ports:
      - "7902:7900"
      - "5552:5555"
    environment:
      - SE_EVENT_BUS_HOST=selenium-hub
      - SE_EVENT_BUS_PUBLISH_PORT=4442
      - SE_EVENT_BUS_SUBSCRIBE_PORT=4443
      - VNC_NO_PASSWORD=1
    privileged: true
    restart: always

  chrome3:
    image: selenium/node-chrome:4.20.0-20240505
    shm_size: 2gb
    networks:
      - devnet
    depends_on:
      - selenium-hub
    ports:
      - "7903:7900"
      - "5553:5555"
    environment:
      - SE_EVENT_BUS_HOST=selenium-hub
      - SE_EVENT_BUS_PUBLISH_PORT=4442
      - SE_EVENT_BUS_SUBSCRIBE_PORT=4443
      - VNC_NO_PASSWORD=1
    privileged: true
    restart: always

  chrome4:
    image: selenium/node-chrome:4.20.0-20240505
    shm_size: 2gb
    networks:
      - devnet
    depends_on:
      - selenium-hub
    ports:
      - "7904:7900"
      - "5554:5555"
    environment:
      - SE_EVENT_BUS_HOST=selenium-hub
      - SE_EVENT_BUS_PUBLISH_PORT=4442
      - SE_EVENT_BUS_SUBSCRIBE_PORT=4443
      - VNC_NO_PASSWORD=1
    privileged: true
    restart: always

  chrome_video1:
    image: selenium/video:ffmpeg-6.1.1-20240505
    networks:
      - devnet
    volumes:
      - ../test_videos:/videos/
    depends_on:
      - chrome1
    environment:
      - DISPLAY_CONTAINER_NAME=chrome1
      - SE_VIDEO_FILE_NAME=auto
      - SE_NODE_GRID_URL=http://selenium-hub:4444
    privileged: true
    restart: always

  chrome_video2:
    image: selenium/video:ffmpeg-6.1.1-20240505
    networks:
      - devnet
    volumes:
      - ../test_videos:/videos/
    depends_on:
      - chrome2
    environment:
      - DISPLAY_CONTAINER_NAME=chrome2
      - SE_VIDEO_FILE_NAME=auto
      - SE_NODE_GRID_URL=http://selenium-hub:4444
    privileged: true
    restart: always

  chrome_video3:
    image: selenium/video:ffmpeg-6.1.1-20240505
    networks:
      - devnet
    volumes:
      - ../test_videos:/videos/
    depends_on:
      - chrome3
    environment:
      - DISPLAY_CONTAINER_NAME=chrome3
      - SE_VIDEO_FILE_NAME=auto
      - SE_NODE_GRID_URL=http://selenium-hub:4444
    privileged: true
    restart: always

  chrome_video4:
    image: selenium/video:ffmpeg-6.1.1-20240505
    networks:
      - devnet
    volumes:
      - ../test_videos:/videos/
    depends_on:
      - chrome4
    environment:
      - DISPLAY_CONTAINER_NAME=chrome4
      - SE_VIDEO_FILE_NAME=auto
      - SE_NODE_GRID_URL=http://selenium-hub:4444
    privileged: true
    restart: always

  selenium-hub:
    image: selenium/hub:4.20.0-20240505
    container_name: selenium-hub
    hostname: selenium
    ports:
      - "4442:4442"
      - "4443:4443"
      - "4444:4444"
    privileged: true
    restart: always
    networks:
      - devnet

networks:
  devnet:
    driver: bridge

================================================
FILE: .dockerignore
================================================
# Ignore everything
*

# But not these files...
!*.json
!*.py
!Pipfile
!Pipfile.lock
!.gitignore
!.dockerignore
!*.toml
!*.md
!*.rst
!LICENSE
!*.schema
!Makefile
!dependabot.yaml
!poetry.lock
!behave.ini
!*.Dockerfile

# Or these folders...
!.github
!*.png
!.github/ISSUE_TEMPLATE
!.github/ISSUE_TEMPLATE/*.yaml
!.github/workflows
!.github/workflows/*.yml
!uk_bin_collection
!uk_bin_collection/**/*
!uk_bin_collection_api_server
!uk_bin_collection_api_server/**/*
!wiki
!wiki/**/*
!custom_components
__pycache__
!TO_BE_CONVERTED
!.devcontainer

================================================
FILE: .github/ISSUE_TEMPLATE/COUNCIL_ISSUE.yaml
================================================
name: Council Issue
description: Issue with an existing council
labels: ["bug"]
body:
  - type: input
    id: council
    attributes:
      label: Name of Council
      description: Which council were you trying to use?
      placeholder: e.g. Huntingdon District Council
    validations:
      required: true
  - type: textarea
    id: extra
    attributes:
      label: Issue Information
      description: What is the issue you're experiencing? How can we reproduce it?
placeholder: Detailed explanation of the issue along with replication steps - type: checkboxes id: verification attributes: label: Verification description: 'Please verify that you''ve followed these steps:' options: - label: I searched for similar issues at https://github.com/robbrad/UKBinCollectionData/issues?q=is:issue and found no duplicates required: true - label: I have checked my address/postcode/UPRN works on the council's website required: true - label: I have provided a detailed explanation of the issue as well as steps to replicate the issue required: true - label: I understand that this project is run by volunteer contributors therefore completion of this issue cannot be guaranteed required: true ================================================ FILE: .github/ISSUE_TEMPLATE/COUNCIL_REQUEST.yaml ================================================ name: Council Request description: Request for a council to be added to the repository labels: ["council request"] body: - type: input id: council attributes: label: Name of Council description: What council are you wishing to be added placeholder: e.g. Huntingdon District Council validations: required: true - type: input id: postcode attributes: label: Example Address/Postcode description: Please provide a tested working example address/postcode for the council's area placeholder: e.g. 
PE7 3YQ
    validations:
      required: true
  - type: textarea
    id: extra
    attributes:
      label: Additional Information
      description: Add any other information here
      placeholder: Links to the council's site, information you have already gathered
  - type: checkboxes
    id: verification
    attributes:
      label: Verification
      description: 'Please verify that you''ve followed these steps:'
      options:
        - label: I've checked the [wiki](https://github.com/robbrad/UKBinCollectionData/wiki/Councils#contents) and verified that my council has not been added
          required: true
        - label: I've checked that a request for my council does not already exist in the [Issues tracker](https://github.com/robbrad/UKBinCollectionData/issues?q=is%3Aopen+is%3Aissue+label%3A"council+request")
          required: true
        - label: I have provided a tested working address/postcode/UPRN with bin collections available, as well as a link to the council's website
          required: true
        - label: I understand that this project is run by volunteer contributors and completion depends on numerous factors - even with a request, we cannot guarantee if/when your council will get a script
          required: true

================================================
FILE: .github/ISSUE_TEMPLATE/HOME_ASSISTANT_CUSTOM_COMPONENT_ISSUE.yaml
================================================
name: Home Assistant Custom Component Issue
description: Issue with the Home Assistant custom component
labels: ["bug", "home assistant custom component"]
body:
  - type: markdown
    attributes:
      value: If you were trying to add a specific council, please check it is listed as working [here](https://robbrad.github.io/UKBinCollectionData/3.12/) and open a [Council Issue](https://github.com/robbrad/UKBinCollectionData/issues/new/choose) instead if it's failing
  - type: input
    id: ha_version
    attributes:
      label: Home Assistant Version
      description: Which version of Home Assistant are you running?
      placeholder: e.g.
2023.10.3
    validations:
      required: true
  - type: dropdown
    id: install_method
    attributes:
      label: Installation Method
      description: How did you install the custom component?
      options:
        - Using HACS
        - Manually
    validations:
      required: true
  - type: input
    id: council
    attributes:
      label: Name of Council (if relevant)
      description: Which council were you trying to use?
      placeholder: e.g. Huntingdon District Council
  - type: textarea
    id: extra
    attributes:
      label: Issue Information
      description: What issue are you experiencing? How can we reproduce it?
      placeholder: Detailed explanation of the issue along with replication steps
    validations:
      required: true
  - type: checkboxes
    id: verification
    attributes:
      label: Verification
      description: 'Please verify that you''ve followed these steps:'
      options:
        - label: I searched for similar issues at https://github.com/robbrad/UKBinCollectionData/issues?q=is:issue and found no duplicates
          required: true
        - label: If trying to add a specific council, I've checked it is listed as working at https://robbrad.github.io/UKBinCollectionData/3.12/
          required: true
        - label: I have provided a detailed explanation of the issue as well as steps to replicate the issue
          required: true
        - label: I understand that this project is run by volunteer contributors, so completion of this issue cannot be guaranteed
          required: true

================================================
FILE: .github/dependabot.yaml
================================================
---
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: daily
      time: "06:00"
    commit-message:
      # Prefix all commit messages with "chore: "
      prefix: "chore"

================================================
FILE: .github/workflows/behave_pull_request.yml
================================================
name: PR - Test Councils

on:
  workflow_dispatch:
  pull_request:
    branches: [ "master" ]
    paths-ignore:
      - "wiki/**"
      - "**/*.md"
      - "uk_bin_collection_api_server/**"

jobs:
  setup:
    name: Setup
Environment runs-on: ubuntu-latest steps: - uses: actions/checkout@v6 - name: Install Poetry run: pipx install poetry==1.8.4 - uses: actions/setup-python@v6 with: python-version: 3.12 - name: Install Dependencies run: make install-dev - name: Lint JSON run: jq empty uk_bin_collection/tests/input.json - name: Get All Council Files That Have Changed id: changed-council-files uses: tj-actions/changed-files@v47 with: files: | uk_bin_collection/uk_bin_collection/councils/**.py - name: Set Council Tests Environment Variable id: set-council-tests run: | IFS=' ' read -ra FILES <<< "${{ steps.changed-council-files.outputs.all_changed_files }}" COUNCIL_TESTS="" for file in "${FILES[@]}"; do FILENAME=$(basename "$file" .py) if [ -z "$COUNCIL_TESTS" ]; then COUNCIL_TESTS="$FILENAME" else COUNCIL_TESTS="$COUNCIL_TESTS or $FILENAME" fi done echo "council_tests=$COUNCIL_TESTS" >> $GITHUB_OUTPUT outputs: council_tests: ${{ steps.set-council-tests.outputs.council_tests }} unit-tests: name: Run Unit Tests needs: setup runs-on: ubuntu-latest strategy: matrix: python-version: [3.12] poetry-version: [1.8.4] steps: - uses: actions/checkout@v6 - uses: actions/setup-python@v6 with: python-version: ${{ matrix.python-version }} - name: Install Poetry run: pipx install poetry==${{ matrix.poetry-version }} - name: Install Dependencies run: make install-dev - name: Run Unit Tests run: make unit-tests - name: Upload Test Results to Codecov uses: codecov/codecov-action@v6 with: fail_ci_if_error: false token: ${{ secrets.CODECOV_TOKEN }} file: coverage.xml parity-check: name: Parity Check needs: setup runs-on: ubuntu-latest strategy: matrix: python-version: [3.12] poetry-version: [1.8.4] steps: - uses: actions/checkout@v6 - uses: actions/setup-python@v6 with: python-version: ${{ matrix.python-version }} - name: Install Poetry run: pipx install poetry==${{ matrix.poetry-version }} - name: Install Dependencies run: make install-dev - name: Check Parity of Councils / input.json / Feature file env: 
          repo: ${{ github.event.pull_request.head.repo.full_name || 'robbrad/UKBinCollectionData' }}
          branch: ${{ github.event.pull_request.head.ref || 'master' }}
        run: make parity-check repo="$repo" branch="$branch"

  integration-tests:
    name: Run Integration Tests
    needs: setup
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.12]
        poetry-version: [1.8.4]
    services:
      selenium:
        image: selenium/standalone-chrome:latest
        options: --shm-size=2gb --name selenium --hostname selenium
        ports:
          - 4444:4444
    steps:
      - uses: actions/checkout@v6
      - uses: actions/setup-python@v6
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install Poetry
        run: pipx install poetry==${{ matrix.poetry-version }}
      - name: Install Dependencies
        run: make install-dev
      - name: Run Integration Tests
        env:
          HEADLESS: True
          COUNCIL_TESTS: ${{ needs.setup.outputs.council_tests }}
        run: make matrix=${{ matrix.python-version }} councils="${{ env.COUNCIL_TESTS }}" integration-tests
        continue-on-error: true
      - name: Upload Integration Test Results to Codecov
        uses: codecov/codecov-action@v6
        with:
          fail_ci_if_error: false
          token: ${{ secrets.CODECOV_TOKEN }}
          report_type: test_results
          file: build/${{ matrix.python-version }}/integration-test-results/junit.xml
          flags: integrationtestspr
          name: integration-tests-pr


================================================
FILE: .github/workflows/behave_schedule.yml
================================================
name: Scheduled - Test All Councils
on:
  workflow_dispatch:
  schedule:
    - cron: '0 0 * * *' # Nightly schedule for full test run

jobs:
  setup:
    name: Setup Environment
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v6
      - name: Install Poetry
        run: pipx install poetry==1.8.4
      - uses: actions/setup-python@v6
        with:
          python-version: 3.12
      - name: Install Dependencies
        run: make install-dev
      - name: Lint JSON
        run: jq empty uk_bin_collection/tests/input.json
      - name: Set Council Tests Environment Variable
        id: set-council-tests
        run: |
          COUNCIL_TESTS=""
          echo "council_tests=$COUNCIL_TESTS" >> $GITHUB_OUTPUT
    outputs:
      council_tests: ${{ steps.set-council-tests.outputs.council_tests }}

  unit-tests:
    name: Run Unit Tests
    needs: setup
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.12]
        poetry-version: [1.8.4]
    steps:
      - uses: actions/checkout@v6
      - uses: actions/setup-python@v6
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install Poetry
        run: pipx install poetry==${{ matrix.poetry-version }}
      - name: Install Dependencies
        run: make install-dev
      - name: Run Unit Tests
        run: make unit-tests
      - name: Upload Test Results to Codecov
        uses: codecov/codecov-action@v6
        with:
          fail_ci_if_error: false
          token: ${{ secrets.CODECOV_TOKEN }}
          file: coverage.xml

  parity-check:
    name: Parity Check
    needs: setup
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.12]
        poetry-version: [1.8.4]
    steps:
      - uses: actions/checkout@v6
      - uses: actions/setup-python@v6
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install Poetry
        run: pipx install poetry==${{ matrix.poetry-version }}
      - name: Install Dependencies
        run: make install-dev
      - name: Check Parity of Councils / input.json / Feature file
        run: |
          repo=${{ github.event.pull_request.head.repo.full_name || 'robbrad/UKBinCollectionData' }}
          branch=${{ github.event.pull_request.head.ref || 'master' }}
          make parity-check repo=$repo branch=$branch

  integration-tests:
    name: Run Integration Tests
    needs: setup
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.12]
        poetry-version: [1.8.4]
    services:
      selenium:
        image: selenium/standalone-chrome:latest
        options: --shm-size=2gb --name selenium --hostname selenium
        ports:
          - 4444:4444
    steps:
      - uses: actions/checkout@v6
      - uses: actions/setup-python@v6
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install Poetry
        run: pipx install poetry==${{ matrix.poetry-version }}
      - name: Install Dependencies
        run: make install-dev
      - name: Run Integration Tests
        env:
          HEADLESS: True
          COUNCIL_TESTS: ${{ needs.setup.outputs.council_tests }}
        run: make matrix=${{ matrix.python-version }} councils="${{ env.COUNCIL_TESTS }}" integration-tests
        continue-on-error: true
      - name: Upload Integration Test Results to Codecov
        uses: codecov/codecov-action@v6
        with:
          fail_ci_if_error: false
          token: ${{ secrets.CODECOV_TOKEN }}
          report_type: test_results
          file: build/${{ matrix.python-version }}/integration-test-results/junit.xml
          flags: integrationtestsfullnightly
          name: integration-tests-full-nightly


================================================
FILE: .github/workflows/bump.yml
================================================
name: Release - Bump Version
on:
  push:
    branches: [ "master" ]
    paths-ignore:
      - "wiki/**"
      - "**/*.md"
      - ".github/workflows/**"
  workflow_dispatch: {}

jobs:
  bump:
    if: "!startsWith(github.event.head_commit.message, 'bump:')"
    runs-on: ubuntu-latest
    permissions:
      contents: write
    concurrency: bump
    steps:
      - name: Checkout
        uses: actions/checkout@v6
        with:
          fetch-depth: 0
          ssh-key: ${{ secrets.DEPLOY_KEY }}
          persist-credentials: true
      - name: Setup Python
        uses: actions/setup-python@v6
        with:
          python-version: '3.12'
          cache: 'pip'
      - name: Cache Commitizen
        uses: actions/cache@v5
        with:
          path: ~/.cache/pip
          key: ${{ runner.os }}-pip-commitizen-${{ hashFiles('**/pyproject.toml') }}
          restore-keys: |
            ${{ runner.os }}-pip-commitizen-
      - name: Install Commitizen
        run: pip install commitizen
      - name: Configure git identity
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "41898282+github-actions[bot]@users.noreply.github.com"
      - name: Bump version and create tag
        id: bump
        run: |
          # Check if there are commits to bump
          if cz bump --yes --changelog --dry-run 2>&1 | grep -q "No commits found"; then
            echo "No version bump needed - no conventional commits since last release"
            echo "skip=true" >> $GITHUB_OUTPUT
            exit 0
          fi
          cz bump --yes --changelog
          echo "version=$(cz version --project)" >> $GITHUB_OUTPUT
          echo "skip=false" >> $GITHUB_OUTPUT
      - name: Push changes and tags
        if: steps.bump.outputs.skip != 'true'
        run: |
          git push origin master
          git push origin --tags
      - name: Create workflow summary
        if: always()
        run: |
          echo "## Bump Summary" >> $GITHUB_STEP_SUMMARY
          if [ "${{ steps.bump.outputs.skip }}" == "true" ]; then
            echo "- **Status**: ⏭️ Skipped (no conventional commits)" >> $GITHUB_STEP_SUMMARY
          else
            echo "- **Status**: ✅ Success" >> $GITHUB_STEP_SUMMARY
            echo "- **New Version**: ${{ steps.bump.outputs.version }}" >> $GITHUB_STEP_SUMMARY
            echo "- **Tag Created**: ${{ steps.bump.outputs.version }}" >> $GITHUB_STEP_SUMMARY
          fi


================================================
FILE: .github/workflows/codeql-analysis.yml
================================================
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL"

on:
  push:
    # Trigger unless only the wiki directory changed
    paths:
      - "**/**.py"
      - "**.py"
    branches: [ "master" ]
  pull_request:
    # Trigger unless only the wiki directory changed
    paths:
      - "**/**.py"
      - "**.py"
    # The branches below must be a subset of the branches above
    branches: [ "master" ]
  schedule:
    - cron: '36 13 * * 5'

jobs:
  analyze:
    if: "!startsWith(github.event.head_commit.message, 'bump:')"
    name: Analyze
    runs-on: ubuntu-latest
    permissions:
      actions: read
      contents: read
      security-events: write
    strategy:
      fail-fast: false
      matrix:
        language: [ 'python' ]
        # CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
        # Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support

    steps:
      - name: Checkout repository
        uses: actions/checkout@v6

      # Initializes the CodeQL tools for scanning.
      - name: Initialize CodeQL
        uses: github/codeql-action/init@v4
        with:
          languages: ${{ matrix.language }}
          # If you wish to specify custom queries, you can do so here or in a config file.
          # By default, queries listed here will override any specified in a config file.
          # Prefix the list here with "+" to use these queries and those in the config file.
          # Details on CodeQL's query packs refer to : https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
          # queries: security-extended,security-and-quality

      # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
      # If this step fails, then you should remove it and run the build manually (see below)
      - name: Autobuild
        uses: github/codeql-action/autobuild@v4

      # ℹ️ Command-line programs to run using the OS shell.
      # 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun

      # If the Autobuild fails above, remove it and uncomment the following three lines.
      # modify them (or add more) to build your code if your project, please refer to the EXAMPLE below for guidance.
      # - run: |
      #     echo "Run, Build Application using script"
      #     ./location_of_script_within_repo/buildscript.sh

      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@v4


================================================
FILE: .github/workflows/docker-image.yml
================================================
name: Build - Docker Image
on:
  push:
    # Trigger unless only the wiki directory changed
    paths:
      - "uk_bin_collection_api_server/**"
      - ".github/workflows/docker-image.yml"
    branches: [ "master" ]
  pull_request:
    # Trigger unless only the wiki directory changed
    paths:
      - "uk_bin_collection_api_server/**"
    # The branches below must be a subset of the branches above
    branches: [ "master" ]
  workflow_dispatch:
  schedule:
    - cron: '0 0 * * 0' # This runs at 00:00 on Sunday every week

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v6
      - name: Publish to Registry
        uses: elgohr/Publish-Docker-Github-Action@v5
        with:
          name: robbrad182/uk-bin-collection
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_API_KEY }}
          workdir: uk_bin_collection_api_server


================================================
FILE: .github/workflows/ha_compatibility_test.yml
================================================
name: PR - Home Assistant Compatibility Test
on:
  push:
    branches: [ master, main ]
    paths:
      - 'custom_components/**'
      - 'pyproject.toml'
  pull_request:
    branches: [ master, main ]
    paths:
      - 'custom_components/**'
      - 'pyproject.toml'
  schedule:
    - cron: '0 6 * * 1' # Weekly on Monday at 6 AM UTC

jobs:
  generate-matrix:
    name: Generate HA Version Matrix
    runs-on: ubuntu-latest
    outputs:
      matrix: ${{ steps.get-versions.outputs.matrix }}
    steps:
      - name: Get HA versions from PyPI
        id: get-versions
        run: |
          MATRIX=$(curl -s https://pypi.org/pypi/homeassistant/json | jq -c '
            .releases | to_entries
            # keep only x.y.z (skip betas/devs/post)
            | map(select(.key | test("^[0-9]+\\.[0-9]+\\.[0-9]+$")))
            # group by major.minor, keep highest patch
            | group_by(.key | (split(".")[:2] | join(".")))
            | map(max_by(.key | (split(".")[2] | tonumber)) | .key)
            # sort numerically and take latest 8
            | sort_by(split(".") | map(tonumber)) | .[-8:]
            # pick python version per HA series; adjust as needed
            | map({ha_version: ., python_version: (if (split(".")[0] == "2025" and (split(".")[1]|tonumber) >= 2) then "3.13" else "3.12" end)})
            # also test latest dev on py 3.13
            | . + [{ha_version: "dev", python_version: "3.13"}]
            | {include: .}
          ')
          echo "matrix=$MATRIX" >> "$GITHUB_OUTPUT"

  test-ha-compatibility:
    name: Test HA ${{ matrix.ha_version }}
    runs-on: ubuntu-latest
    needs: generate-matrix
    strategy:
      fail-fast: false
      matrix: ${{ fromJson(needs.generate-matrix.outputs.matrix) }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v6

      - name: Determine Docker tag
        id: docker-tag
        run: |
          if [ "${{ matrix.ha_version }}" = "dev" ]; then
            echo "tag=dev" >> "$GITHUB_OUTPUT"
          else
            echo "tag=${{ matrix.ha_version }}" >> "$GITHUB_OUTPUT"
          fi

      - name: Setup HA config directory
        run: |
          mkdir -p config/custom_components config/.storage
          cp -r custom_components/uk_bin_collection config/custom_components/
          cat > config/configuration.yaml <<'YAML'
          logger:
            default: info
          YAML
          # Create a config entry to trigger component setup
          cat > config/.storage/core.config_entries <<'JSON'
          {
            "version": 1,
            "minor_version": 1,
            "key": "core.config_entries",
            "data": {
              "entries": [
                {
                  "entry_id": "test_uk_bin_collection",
                  "version": 3,
                  "domain": "uk_bin_collection",
                  "title": "Test Entry",
                  "data": {
                    "name": "Test Council",
                    "council": "GooglePublicCalendarCouncil",
                    "url": "https://calendar.google.com/calendar/ical/0d775884b4db6a7bae5204f06dae113c1a36e505b25991ebc27c6bd42edf5b5e%40group.calendar.google.com/public/basic.ics",
                    "timeout": 60,
                    "update_interval": 12,
                    "manual_refresh_only": true
                  },
                  "options": {},
                  "pref_disable_new_entities": false,
                  "pref_disable_polling": false,
                  "source": "user",
                  "unique_id": null,
                  "disabled_by": null
                }
              ]
            }
          }
          JSON

      - name: Start Home Assistant in Docker
        run: |
          docker run -d \
            --name homeassistant \
            -v $(pwd)/config:/config \
            -e TZ=UTC \
            ghcr.io/home-assistant/home-assistant:${{ steps.docker-tag.outputs.tag }}
          echo "Waiting for container to start..."
          sleep 5

      - name: Wait for Home Assistant to boot
        id: boot
        run: |
          set -euo pipefail
          TIMEOUT=150
          SECS=0
          INIT_MARKER="Home Assistant initialized"
          FAIL=0
          echo "Waiting for HA to initialize..."
          while (( SECS < TIMEOUT )); do
            LOGS=$(docker logs homeassistant 2>&1)
            if echo "$LOGS" | grep -q "$INIT_MARKER"; then
              echo "✅ HA initialized successfully"
              break
            fi
            sleep 1
            SECS=$((SECS+1))
            if (( SECS % 10 == 0 )); then
              echo "Waiting... ${SECS}s"
            fi
          done
          # Check for dependency installation and component setup
          LOGS=$(docker logs homeassistant 2>&1)
          if echo "$LOGS" | grep -q "Attempting install of uk-bin-collection"; then
            echo "✅ HA attempted to install uk-bin-collection dependency"
          fi
          if echo "$LOGS" | grep -Eq "(ERROR|CRITICAL).*(uk_bin_collection|custom_components\.uk_bin_collection)"; then
            echo "❌ Component has errors in logs:"
            echo "$LOGS" | grep -E "(ERROR|CRITICAL).*(uk_bin_collection|custom_components\.uk_bin_collection)" || true
            FAIL=1
          fi
          # Check timeout
          if (( SECS >= TIMEOUT )) && ! echo "$LOGS" | grep -q "$INIT_MARKER"; then
            echo "❌ HA did not finish booting within ${TIMEOUT}s"
            FAIL=1
          fi
          # Expose pass/fail to later steps
          echo "boot_failed=${FAIL}" >> "$GITHUB_OUTPUT"
          exit ${FAIL}

      - name: Save HA logs to file
        if: always()
        run: |
          docker logs homeassistant > home-assistant.log 2>&1 || true

      - name: Show HA logs
        if: always()
        run: |
          echo "--- Last 80 log lines ---"
          tail -n 80 home-assistant.log 2>/dev/null || docker logs homeassistant 2>&1 | tail -n 80

      - name: Stop and remove container
        if: always()
        run: |
          docker stop homeassistant || true
          docker rm homeassistant || true

      - name: Upload HA log (always)
        if: always()
        uses: actions/upload-artifact@v7
        with:
          name: ha-log-${{ matrix.ha_version }}
          path: home-assistant.log
          overwrite: true

      - name: Test manifest validation
        id: manifest
        run: |
          python <<'PY'
          import json, sys
          with open('custom_components/uk_bin_collection/manifest.json') as f:
              m = json.load(f)
          required = ['domain', 'name', 'version', 'requirements']
          missing = [k for k in required if k not in m]
          if missing:
              print(f'❌ Missing required manifest fields: {missing}')
              sys.exit(1)
          print('✅ Manifest validation passed')
          print(f'Component version: {m.get("version")}')
          print(f'Requirements: {m.get("requirements")}')
          PY

      - name: Create test result summary
        if: always()
        run: |
          echo "## Boot Results for HA ${{ matrix.ha_version }} (Python ${{ matrix.python_version }})" >> "$GITHUB_STEP_SUMMARY"
          if [ "${{ steps.boot.outputs.boot_failed }}" = "0" ] && [ "${{ steps.manifest.outcome }}" = "success" ]; then
            echo "✅ **PASSED** – HA booted with the custom component present" >> "$GITHUB_STEP_SUMMARY"
          else
            echo "❌ **FAILED** – HA failed to boot cleanly" >> "$GITHUB_STEP_SUMMARY"
            echo "" >> "$GITHUB_STEP_SUMMARY"
            echo "- boot step failed: \`${{ steps.boot.outputs.boot_failed }}\`" >> "$GITHUB_STEP_SUMMARY"
            echo "- manifest step: \`${{ steps.manifest.outcome }}\`" >> "$GITHUB_STEP_SUMMARY"
            echo "" >> "$GITHUB_STEP_SUMMARY"
            echo "See the uploaded **ha-log** artifact for details." >> "$GITHUB_STEP_SUMMARY"
          fi

  compatibility-report:
    name: Generate Compatibility Report
    runs-on: ubuntu-latest
    needs: [generate-matrix, test-ha-compatibility]
    if: always()
    steps:
      - name: Checkout code
        uses: actions/checkout@v6
      - name: Create compatibility report
        run: |
          echo "# Home Assistant Compatibility Report" > report.md
          echo "" >> report.md
          echo "Matrix tested: \`${{ needs.generate-matrix.outputs.matrix }}\`" >> report.md
          echo "Last updated: $(date -u +"%Y-%m-%d %H:%M:%S UTC")" >> report.md
          cat report.md >> "$GITHUB_STEP_SUMMARY"


================================================
FILE: .github/workflows/hacs_validation.yml
================================================
name: PR - Validate HACS
on:
  push:
  pull_request:
  schedule:
    - cron: "0 0 * * *"

jobs:
  hassfest_validation:
    name: HassFest Validation
    runs-on: "ubuntu-latest"
    steps:
      - uses: "actions/checkout@v6"
      - uses: home-assistant/actions/hassfest@master

  hacs:
    name: HACS Action Validation
    runs-on: "ubuntu-latest"
    steps:
      - name: HACS Action
        uses: "hacs/action@main"
        with:
          category: "integration"


================================================
FILE: .github/workflows/lint.yml
================================================
name: PR - Lint Commit Messages
on:
  push:
    # The branches below must be a subset of the branches above
    branches: [ "master" ]
  pull_request:
    # The branches below must be a subset of the branches above
    branches: [ "master" ]

jobs:
  # Make sure commit messages follow the conventional commits convention:
  # https://www.conventionalcommits.org
  commitlint:
    if: "!startsWith(github.event.head_commit.message, 'bump:')"
    name: Lint Commit Messages
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v6
        with:
          fetch-depth: 0
      - run: "echo \"export default {extends: ['@commitlint/config-conventional'], rules: { 'subject-case': [0], 'body-max-line-length': [0], 'footer-max-line-length': [0] }}\" > commitlint.config.mjs"
      - uses: wagoid/commitlint-github-action@v6
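The release workflow that follows derives `RELEASE_VERSION` from the pushed tag ref with the `${GITHUB_REF#refs/*/}` parameter expansion. A standalone sketch of that stripping behaviour, using a hypothetical tag value (Actions sets `GITHUB_REF` itself on tag pushes):

```shell
#!/bin/sh
# `#refs/*/` removes the shortest leading match of the pattern `refs/*/`,
# so `refs/tags/0.165.0` becomes the bare tag name `0.165.0`.
GITHUB_REF="refs/tags/0.165.0"   # example value, not set by this script
RELEASE_VERSION="${GITHUB_REF#refs/*/}"
echo "$RELEASE_VERSION"          # prints: 0.165.0
```

The same expansion also handles branch refs (`refs/heads/master` → `master`), which is why the workflow guards it with a separate "Verify version matches tag" step.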
================================================
FILE: .github/workflows/release.yml
================================================
name: Release - Publish to PyPI
on:
  push:
    tags:
      - '*'

jobs:
  release:
    runs-on: ubuntu-latest
    permissions:
      contents: write
      id-token: write
    steps:
      - name: Checkout
        uses: actions/checkout@v6
      - name: Setup Python
        uses: actions/setup-python@v6
        with:
          python-version: '3.12'
          cache: 'pip'
      - name: Install Poetry
        uses: abatilo/actions-poetry@v4.0.0
        with:
          poetry-version: '1.8.4'
      - name: Cache Poetry dependencies
        uses: actions/cache@v5
        with:
          path: ~/.cache/pypoetry
          key: ${{ runner.os }}-poetry-${{ hashFiles('**/poetry.lock') }}
          restore-keys: |
            ${{ runner.os }}-poetry-
      - name: Set release version
        run: echo "RELEASE_VERSION=${GITHUB_REF#refs/*/}" >> $GITHUB_ENV
      - name: Verify version matches tag
        run: |
          POETRY_VERSION=$(poetry version -s)
          if [ "$POETRY_VERSION" != "${{ env.RELEASE_VERSION }}" ]; then
            echo "Error: Poetry version ($POETRY_VERSION) doesn't match tag (${{ env.RELEASE_VERSION }})"
            exit 1
          fi
      - name: Build package
        run: poetry build
      - name: Create GitHub release
        uses: ncipollo/release-action@v1
        with:
          tag: ${{ env.RELEASE_VERSION }}
          generateReleaseNotes: true
          artifacts: "dist/*"
          token: ${{ secrets.GITHUB_TOKEN }}
      - name: Publish to PyPI
        uses: nick-fields/retry@v4
        with:
          timeout_minutes: 5
          max_attempts: 3
          retry_wait_seconds: 30
          command: |
            poetry config pypi-token.pypi "${{ secrets.PYPI_API_KEY }}"
            poetry publish
      - name: Create workflow summary
        if: always()
        run: |
          echo "## Release Summary" >> $GITHUB_STEP_SUMMARY
          echo "- **Version**: ${{ env.RELEASE_VERSION }}" >> $GITHUB_STEP_SUMMARY
          echo "- **Status**: ${{ job.status }}" >> $GITHUB_STEP_SUMMARY
          if [ "${{ job.status }}" == "success" ]; then
            echo "- **PyPI**: https://pypi.org/project/uk-bin-collection/${{ env.RELEASE_VERSION }}/" >> $GITHUB_STEP_SUMMARY
            echo "- **GitHub Release**: https://github.com/${{ github.repository }}/releases/tag/${{ env.RELEASE_VERSION }}" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "✅ Release published successfully!" >> $GITHUB_STEP_SUMMARY
          else
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "❌ Release failed - check logs above" >> $GITHUB_STEP_SUMMARY
          fi

  docker:
    needs: release
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v6
      - name: Login to Docker Hub
        uses: docker/login-action@v4
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_API_KEY }}
      - name: Build and push Docker image
        uses: docker/build-push-action@v7
        with:
          context: ./uk_bin_collection_api_server
          push: true
          tags: |
            robbrad182/uk-bin-collection:${{ github.ref_name }}
            robbrad182/uk-bin-collection:latest


================================================
FILE: .github/workflows/rollback-release.yml
================================================
name: Release - Rollback
on:
  workflow_dispatch:
    inputs:
      version:
        description: 'Version to rollback (e.g., 0.155.0)'
        required: true
        type: string
      delete_pypi:
        description: 'Also yank from PyPI? (cannot delete, only yank)'
        required: false
        type: boolean
        default: false

jobs:
  rollback:
    runs-on: ubuntu-latest
    permissions:
      contents: write
    steps:
      - name: Checkout
        uses: actions/checkout@v6
        with:
          fetch-depth: 0
          ssh-key: ${{ secrets.DEPLOY_KEY }}
      - name: Validate version format
        run: |
          if ! [[ "${{ inputs.version }}" =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
            echo "❌ Invalid version format. Use X.Y.Z (e.g., 0.155.0)"
            exit 1
          fi
          echo "✅ Version format valid: ${{ inputs.version }}"
      - name: Check if release exists
        id: check
        run: |
          if gh release view ${{ inputs.version }} > /dev/null 2>&1; then
            echo "exists=true" >> $GITHUB_OUTPUT
            echo "✅ Release ${{ inputs.version }} exists"
          else
            echo "exists=false" >> $GITHUB_OUTPUT
            echo "⚠️ Release ${{ inputs.version }} not found"
          fi
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      - name: Delete GitHub Release
        if: steps.check.outputs.exists == 'true'
        run: |
          echo "🗑️ Deleting GitHub release ${{ inputs.version }}..."
          gh release delete ${{ inputs.version }} --yes
          echo "✅ GitHub release deleted"
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
      - name: Delete Git Tag
        run: |
          if git rev-parse ${{ inputs.version }} >/dev/null 2>&1; then
            echo "🗑️ Deleting git tag ${{ inputs.version }}..."
            git push origin :refs/tags/${{ inputs.version }}
            echo "✅ Git tag deleted"
          else
            echo "⚠️ Tag ${{ inputs.version }} not found locally"
          fi
      - name: Setup Python (if PyPI yank requested)
        if: inputs.delete_pypi == true
        uses: actions/setup-python@v6
        with:
          python-version: '3.12'
      - name: Install Poetry (if PyPI yank requested)
        if: inputs.delete_pypi == true
        uses: abatilo/actions-poetry@v4.0.0
        with:
          poetry-version: '1.8.4'
      - name: Yank from PyPI
        if: inputs.delete_pypi == true
        run: |
          echo "⚠️ Yanking version ${{ inputs.version }} from PyPI..."
          echo "Note: This marks the release as unsuitable for installation but doesn't delete it"
          poetry config pypi-token.pypi "${{ secrets.PYPI_API_KEY }}"
          # PyPI doesn't support yanking via poetry directly, need to use twine
          pip install twine
          # Note: You'll need to manually yank via PyPI web interface or use:
          # twine upload --repository pypi --skip-existing dist/*
          echo "⚠️ PyPI yanking must be done manually at: https://pypi.org/manage/project/uk-bin-collection/releases/"
          echo "Go to the release and click 'Options' -> 'Yank release'"
      - name: Create workflow summary
        if: always()
        run: |
          echo "## Rollback Summary" >> $GITHUB_STEP_SUMMARY
          echo "- **Version**: ${{ inputs.version }}" >> $GITHUB_STEP_SUMMARY
          echo "- **GitHub Release**: ${{ steps.check.outputs.exists == 'true' && '✅ Deleted' || '⚠️ Not found' }}" >> $GITHUB_STEP_SUMMARY
          echo "- **Git Tag**: Deleted from remote" >> $GITHUB_STEP_SUMMARY
          if [ "${{ inputs.delete_pypi }}" == "true" ]; then
            echo "- **PyPI**: ⚠️ Manual yank required" >> $GITHUB_STEP_SUMMARY
            echo "" >> $GITHUB_STEP_SUMMARY
            echo "### Next Steps for PyPI" >> $GITHUB_STEP_SUMMARY
            echo "1. Go to https://pypi.org/manage/project/uk-bin-collection/releases/" >> $GITHUB_STEP_SUMMARY
            echo "2. Find version ${{ inputs.version }}" >> $GITHUB_STEP_SUMMARY
            echo "3. Click 'Options' -> 'Yank release'" >> $GITHUB_STEP_SUMMARY
          fi
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "### ⚠️ Important Notes" >> $GITHUB_STEP_SUMMARY
          echo "- The version bump commit still exists in git history" >> $GITHUB_STEP_SUMMARY
          echo "- To fully rollback, you may need to revert the bump commit" >> $GITHUB_STEP_SUMMARY
          echo "- Users who already installed this version will keep it" >> $GITHUB_STEP_SUMMARY
      - name: Notify completion
        run: |
          echo "✅ Rollback completed for version ${{ inputs.version }}"
          echo "Check the summary tab for details"


================================================
FILE: .github/workflows/validate-release-ready.yml
================================================
name: PR - Validate Release Ready
on:
  workflow_dispatch:
  pull_request:
    branches: [ "master" ]
    types: [opened, synchronize, reopened]

jobs:
  validate:
    name: Validate Release Prerequisites
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v6
        with:
          fetch-depth: 0
      - uses: actions/setup-python@v6
        with:
          python-version: '3.12'
          cache: 'pip'
      - name: Cache Poetry
        uses: actions/cache@v5
        with:
          path: ~/.local/pipx
          key: ${{ runner.os }}-pipx-poetry-1.8.4
          restore-keys: |
            ${{ runner.os }}-pipx-poetry-
      - name: Install Poetry
        run: pipx install poetry==1.8.4
      - name: Validate pyproject.toml
        run: poetry check
      - name: Check for conventional commits
        uses: wagoid/commitlint-github-action@v6
        with:
          configFile: commitlint.config.mjs


================================================
FILE: .github/workflows/wiki.yml
================================================
name: Deploy - Wiki
on:
  push:
    # Trigger only when wiki directory changes
    paths:
      - "wiki/**"
      - "uk_bin_collection/tests/input.json"
    branches: [ "master" ]
  pull_request:
    # Trigger only when wiki directory changes
    paths:
      - "wiki/**"
      - "uk_bin_collection/tests/input.json"
    # The branches below must be a subset of the branches above
    branches: [ "master" ]

jobs:
  deploy-wiki:
    # Only run on main branch push (e.g. after pull request merge).
    if: github.event_name == 'push'
    runs-on: ubuntu-latest
    environment: wiki
    steps:
      - uses: actions/checkout@v6
      - uses: actions/setup-python@v6
        with:
          python-version: '3.12'
      - name: Run image
        uses: abatilo/actions-poetry@v4.0.0
        with:
          poetry-version: '1.8.4'
      - name: Install
        run: make install
      - name: Update Councils.md from input.json
        run: make update-wiki
      - name: Commit and Push Wiki changes
        run: |
          git config --global user.name "Wiki GitHub Action"
          git config --global user.email "action@github.com"
          git add wiki
          git commit -m "docs: Update Councils.md from input.json"
          git push
        continue-on-error: true
      - name: Deploy Wiki Changes
        uses: Andrew-Chen-Wang/github-wiki-action@v5
        with:
          # Make sure WIKI_DIR ends with / as action uses rsync
          path: wiki/
          ignore: "generate_wiki.py"


================================================
FILE: .gitignore
================================================
# Ignore everything
*

# But not these files...
!*.json
!*.py
!PipFile
!Pipfile.lock
!.gitignore
!.dockerignore
!*.toml
!*.md
!*.rst
!LICENSE
!*.schema
!Makefile
!dependabot.yaml
!poetry.lock
!behave.ini
!*.Dockerfile
!docker-compose.yml
!.vscode/launch.json
!pytest.ini

# Or these folders...
!docs
!.github
!*.png
!.github/ISSUE_TEMPLATE
!.github/ISSUE_TEMPLATE/*.yaml
!.github/workflows
!.github/workflows/*.yml
!uk_bin_collection
!uk_bin_collection/**/*
!uk_bin_collection_api_server
!uk_bin_collection_api_server/**/*
!wiki
!wiki/**/*
!custom_components
!custom_components/**/*/
!custom_components/uk_bin_collection/services.yaml
__pycache__
!TO_BE_CONVERTED
!.devcontainer
uk_bin_collection/.DS_Store
uk_bin_collection/uk_bin_collection/.DS_Store
!scripts
!.kiro
ISSUE_RESOLUTION_PROGRESS.md


================================================
FILE: CHANGELOG.md
================================================
## 0.165.0 (2026-03-28)

### Feat

- Lancaster City - support food waste collection (#1895)
- support lancaster city food waste collection
- North Northamptonshire - add food caddy bin type support (#1894)
- add support for food caddy bin type in North Northamptonshire Council scraper

### Fix

- expose errors in lancaster city date parsing
- correct casing for food caddy bin type in North Northamptonshire Council scraper
- Herefordshire Council - incorrectly picking up non-date string (#1888)
- Herefordshire Council incorrectly picking up non-date string
- EalingCouncil/LondonBoroughEaling - use collectionDate not collectionDateString (#1886)
- EalingCouncil/LondonBoroughEaling: Use collectionDate not collectionDateString

### Refactor

- ChorleyCouncil - use requests instead of Selenium (#1891)

## 0.164.0 (2026-03-14)

### Feat

- NewhamCouncil - add food waste collection scraping

### Fix

- update address selection XPath for BroxbourneCouncil
- nuneaton and bedworth
- nuneaton and bedworth
- NewhamCouncil - correct datetime parsing from DD/MM/YYYY to MM/DD/YYYY
- NewhamCouncil - disable SSL verification to resolve certificate verification errors
- updated ID's for multiple elements that had changed
- Broxtowe Borough Council - #1872
- Broxtowe Borough Council
- Adding North Warwickshire Borough Council - #1869
- Adding North Warwickshire Borough Council
- Bath and North East Somerset - #1876
- Bath and North East Somerset
- Hinckley & Bosworth Council - #1879
- Hinckley & Bosworth Council
- Midlothian Council - #1880 Midlothian Council
- Merton Council - #1868
- Merton Council
- Eastleigh Borough Council - #1867
- Eastleigh Borough Council
- London Borough Havering - #1863
- London Borough Havering
- Leeds City Council - #1864
- Leeds City Council
- North East Derbyshire District Council - #1861
- North East Derbyshire District Council
- Cumberland Council - #1858
- Cumberland Council
- Barking & Dagenham - #1855
- Barking & Dagenham
- Redcar and Cleveland Council - #1848
- Redcar and Cleveland Council
- Wakefield City Council - #1853
- Wakefield City Council
- Bromley Borough Council - #1851 Bromley Borough Council
- Mid Suffolk District Council - #1845
- Mid Suffolk District Council
- Powys Council - #1846
- Powys Council
- LondonBoroughHammersmithandFulham
- LondonBoroughHammersmithandFulham
- HarboroughDistrictCouncil
- HarboroughDistrictCouncil
- Adding Hammersmith & Fulham - #1504
- Adding Hammersmith & Fulham
- Harborough District Council - #1831
- Harborough District Council
- London Borough Redbridge - #1836
- fix: London Borough Redbridge

## 0.163.0 (2026-02-02)

### Feat

- #1686 GosportBoroughCouncil - Add new council using Supatrak API
- #1593 #1618 #1794 - Add Causeway Coast and Glens, Rossendale Borough, North Warwickshire to GooglePublicCalendarCouncil

### Fix

- #1831 HarboroughDistrictCouncil - use data instead of json, suppress SSL warnings, improve parsing
- #1831 HarboroughDistrictCouncil - add SSL bypass and better error handling for 502 errors
- #1836 LondonBoroughRedbridge - updated selectors for redesigned website

## 0.162.7 (2026-02-02)

### Fix

- resolve issues #1776, #1780, #1782 - Camden, NE Derbyshire, Newport
- Broken councils
- remove URLs from translation strings for HACS compliance
- **CumberlandCouncil**: remove obsolete duplicate entries
- **AmberValleyBoroughCouncil**: filter invalid date 01/01/0001
- Kingston parser for HTML format change with explicit error handling
- Kingston-upon-Thames website HTML format change
- UttlesfordDistrictCouncil use color names for bin types
- EastHertsCouncil handle empty NextDate values
- UttlesfordDistrictCouncil incorrect bin types due to wrong alt text
- compare dates without time component in UttlesfordDistrictCouncil
- UttlesfordDistrictCouncil hardcoded year 2024
- Wyre Forest District Council - #1835
- Wyre Forest District Council
- Babergh District Council - #1783
- Babergh District Council
- Mid Suffolk District Council - #1746
- Mid Suffolk District Council
- Waverley Borough Council - #1834
- Waverley Borough Council
- London Borough Sutton - #1830
- London Borough Sutton
- Bolton Council - #1792
- Bolton Council
- Coventry City Council - #1808
- Coventry City Council
- Slough Borough Council - #1822
- Slough Borough Council
- Bromley Borough Council - #1829
- Bromley Borough Council
- Burnley Borough Council - #1820
- Burnley Borough Council

### Refactor

- add explicit datetime import in UttlesfordDistrictCouncil

## 0.162.6 (2026-01-14)

### Fix

- FolkestoneandHytheDistrictCouncil.py
- CastlepointDistrictCouncil
- Folkestone and Hythe District Council - Castlepoint District Council - #1803 #1793
- Castlepoint District Council - Folkstone and Hythe District Council - #1760
- Folkstone and Hythe District Council
- Newark and Sherwood District Council - #1777
- fix: Newark and Sherwood District Council
- South Lanarkshire Council - #1771
- South Lanarkshire Council
- Renfrewshire Council - #1500
- Renfrewshire Council

## 0.162.5 (2025-12-08)

### Fix

- West Oxfordshire
- West Oxfordshire
- Adur & Worthing (#1454), Hillingdon (#1680)

## 0.162.4 (2025-12-08)

### Fix

- Cumberland Council

## 0.162.3 (2025-12-08)

### Fix

- Islington, Worcester

## 0.162.2 (2025-12-07)

### Fix

- Broken councils

## 0.162.1 (2025-12-07)

## 0.162.0 (2025-12-07)

### Feat

- Add support for Isle of Anglesey County Council
- replace Selenium with Cloud9 mobile API for NHDC bin collection data
- Adding Harlow Council - #1639 Adding Harlow Council
- Adding Blackpool Council - #1640 Adding Blackpool Council

### Fix

- add User-Agent header to KingsLynnandWestNorfolkBC scraper
- **southgloucestershirecouncil**: check none instead of empty string
- Treat missing response data as an error to prevent silent failure
- address latest CodeRabbit feedback
- address CodeRabbit feedback
- Fix input.json data
- fix UPRN param encoding for SouthamptonCityCouncil
- Replace loop variable for clarity in North Hertfordshire parsing logic
- Let requests handle query param encoding
- Address edge case in address splitting
- Handle edge case for date parsing validation
- Don't bail on invalid date format
- Add comment explaining where auth header came from
- Improve error handling for mobile API requests and JSON parsing
- Use constant for mobile API container count
- Improve error handling for mobile API JSON response
- Use named imports from common
- Add sorting key for bin collections using parsed datetime
- Improve error handling for collection date parsing
- Use `line.strip()` in list comp
- Update comment to match behaviour
- Amend list comprehension variable name to avoid shadowing
- Remove unused variable assignment
- Perform postcode/paon string manipulation after checking truthiness
- WiltshireCouncil.py
- Rushmoor Council - #1724
- Rushmoor Council
- Wiltshire Council - #1689
- Wiltshire Council
- Halton Borough Council - #1209 Halton Borough Council
- Northumberland Council - #1711
- Northumberland Council - Requires 12 digit UPRN
- South Lanarkshire Council - #1712
- South Lanarkshire Council
- Argyll and Bute Council - #1718
- Argyll and Bute Council
- Thurrock Council - #1720
- Thurrock Council
- Mid Sussex - #1721 Mid Sussex
- Chelmsford City Council - #1707 - #1706
- London Borough of Lambeth - #1706
- London Borough of Lambeth
- Fife Council
- Armagh Banbridge Craigavon Council - #1622

### Refactor

- Adjust return payload aggregation logic

## 0.161.0 (2025-11-08)

### Feat

- Dumfries and Galloway Council

### Fix

- Herefordshire Council
- Herefordshire Council
- Southampton City Council - #1698
- Newport City Council - #1229
- Middlesborough Council - #1382 - Removed the need for Selenium
- Boston Borough Council - #1690
- Chelmsford City Council - #1688 - BREAKING CHANGE
- Derby City Council - #1676
- London Borough of Hounslow - #1683
- Brighton & Hove - #1685 - New URL
- Brighton & Hove - #1685 - New URL
- Hart District Council - #1625
- London Borough of Harrow - #1621
- Wokingham Borough Council - #1641
- Norwich City Council - #1653
- Rochdale Council - #1675 fix: #1259
- **tendring**: ignore stale 'Next collection' dates older than today
- **tendring**: restore headless=True default and silence unused lambda arg for lint
- **tendring**: use 'Next collection' column; fix imports/strings/waits; robust iframe/cookie handling
- **tendring**: read 'Next collection' column; harden cookie/iframe handling; normalise dd/MM/YYYY

## 0.160.1 (2025-10-21)

### Fix

- test valley wrong dates
- Remove merge conflict
- remove merge conflict
- remove merge conflict message

## 0.160.0 (2025-10-21)

### Feat

- adding tests to ensure releases work with Home assistant

## 0.159.3 (2025-10-20)

### Fix

- Broken dependencies

## 0.159.2 (2025-10-19)

### Fix

- allow pillow 11.x to fix home assistant compatibility

## 0.159.1 (2025-10-18)

### Fix

- Add null checks to prevent AttributeError when collection date text is not found. Introduces extract_collection_date() helper that safely extracts dates and returns None if parsing fails, allowing the scraper to gracefully skip missing collection types.
## 0.159.0 (2025-10-18)

### Feat

- Modernize South Kesteven scraper with requests-based approach and OCR

### Fix

- **feedbank**: address improvements suggested in PR review
- Update NorthTynesideCouncil to reflect changes to website and extract schedule from the UPRN linked page

## 0.158.1 (2025-10-18)

### Fix

- remove merge conflict annotations and delete old code
- click on the submit button instead of sending ENTER

## 0.158.0 (2025-10-11)

### Feat

- workflow overhaul

## 0.157.0 (2025-10-11)

### Feat

- Create tag-on-merge.yml
- Update bump.yml
- fix bump.yml
- Update TorbayCouncil.py
- fix release pipeline bump.yml
- fix Torbay

### Fix

- Update AberdeenCityCouncil.py
- Update TorbayCouncil.py
- Update URL for NewForestCouncil - New URL and page for wheelie bins
- improve Mid Suffolk District Council holiday handling with dynamic bank holiday detection
- Oxford now rejects the "Requests" default user agent
- #1557 - Adding East Dunbartonshire
- #1569 - Somerset Council
- #1559 - Newport City Council
- #1574 - Test Valley Borough Council
- #1566 South Gloucestershire Council

## 0.154.0 (2025-09-21)

### Feat

- handle changes to northumberland council website
- modify input for NorthumberlandCouncil to accept uprn instead of house number, and use new page structure

### Fix

- the cookie banner is not optional
- #1570 - Slough Borough Council
- #1520 - Erewash Borough Council
- #1554 - Folkestone and Hythe District Council
- #1604 - West Berkshire Council
- #1606 - Brighton and Hove City Council
- #1565 - BCP Council
- #1571 - Castle Point District Council
- #1584 - NorthHertfordshireDistrictCouncil
- #1599 - Basingstoke Council
- #1587 - Hartlepool Borough Council
- #1588 Glasgow City Council
- #1591 Rushmoor Council

## 0.153.0 (2025-09-02)

### Feat

- Change buckinghamshire council to get data from endpoint

### Fix

- #1573 Update Bolton council URL
- East Herts Council - #1575
- Runnymede Borough Council - #1513
- Wiltshire Council - #1533
- Staffordshire Moorlands District Council - #1535
- Ipswich Borough Council - #1548
- North East Lincs
- Hinckley and Bosworth Borough Council
- Nuneaton Bedworth Borough Council - #1514
- Lichfield District Council - #1549

## 0.152.11 (2025-08-25)

### Feat

- fix releases process

### Fix

- date extraction in RochfordCouncil data parsing
- parsing error in BH selenium
- **hacs**: respect the headless option

### Refactor

- **hacs**: improve build_ukbcd_args with formatter functions

## 0.152.10 (2025-08-04)

### Fix

- Gateshead and East Lothian
- Enfield and Broxbourne
- East Herts
- FermanaghOmaghDistrictCouncil

## 0.152.9 (2025-08-03)

### Fix

- Cotswold and Coventry
- Fixing multiple broken councils

## 0.152.8 (2025-07-26)

### Fix

- Add headers to request for Swindon Borough Council
- Add headers to requests for Royal Borough of Greenwich. Fixes #1496 by ensuring that the requests are not rejected due to lack of headers.
- **MidlothianCouncil**: add request headers to resolve 403 Forbidden

## 0.152.7 (2025-07-01)

### Fix

- maidstone selenium fix

## 0.152.6 (2025-06-18)

### Fix

- removed In Progress from date
- removed a debug print statement
- **RugbyBoroughCouncil**: Amended parsed date from full to abbreviated month date; May worked but Jun and Jul did not
- **RugbyBoroughCouncil**: Amended parsed date
- Reworked Cumberland Council to cater for postcode addition
- **OxfordCityCouncil**: Fixed Oxford City Council parsing due to changes in output from the website

## 0.152.5 (2025-06-07)

### Fix

- South Ribble and version pinning issues for input.json

## 0.152.4 (2025-06-07)

### Fix

- **SouthRibble**: Corrected date formatting issue
- **SouthRibble**: Resolved South Ribble without selenium

## 0.152.3 (2025-06-04)

### Fix

- NorthHertfordshire selenium script
- Adur council
- Eastleigh date fix
- removed duplicates in BradfordMDC

## 0.152.2 (2025-06-04)

### Fix

- Update Makefile
- Update CheshireEastCouncil.py
- GitHub action to handle branch name with parentheses

## 0.152.1 (2025-05-15)

### Fix

- Update to fix North Somerset
- Glasgow SSL bypass
- more robust Northumberland
- updated Eastleigh input.json
- Eastleigh cloudflare fix
- converted collection datetimes into dates for BH parsing
- added check_uprn to simplified councils
- simplified Swindon
- simplified East Devon
- simplified Dover
- Simplified Dartford
- simplified Cheshire East
- simplified Charnwood input.json
- improved Charnwood
- Adur Worthing fix
- Chorley simplification
- Bexley simplification
- added URL to Torbay script
- Guildford fixes
- reworked Maidstone
- maidstone input.json
- Croydon selenium version
- Stoke date-time fix

## 0.152.0 (2025-05-02)

### Feat

- Added Fermanagh Omagh
- Added Tewkesbury
- added Slough council
- Added Angus Council
- added Angus to input.json

### Fix

- Chichester now only requires postcode and house number
- Broadland now only requires postcode and house number
- Barking now only requires postcode and house number
- Brighton now only requires postcode and house number
- ensured all bins for this council
- added skip_get_url to hyndburn

## 0.151.0 (2025-04-27)

### Feat

- version bump

### Fix

- more robust brent date handling
- input.json requires web_driver
- Rugby fix

## 0.150.0 (2025-04-27)

### Feat

- added melton
- added pembrokeshire

### Fix

- added melton
- processed all bins for Moray

## 0.148.6 (2025-04-27)

## 0.148.5 (2025-04-27)

### Fix

- output check
- parsed bin info
- selenium navigation
- input.json changes

## 0.148.4 (2025-04-27)

### Fix

- used canonical 'nice name'

## 0.148.3 (2025-04-25)

### Fix

- working hyndburn
- hyndburn input.json

## 0.148.2 (2025-04-24)

### Fix

- Update docker-compose.yml
- updated input.json
- cloudflare fix
- switch to selenium method
- simplified blackburn

## 0.148.1 (2025-04-22)

### Fix

- added bank holiday offsets
## 0.148.0 (2025-04-19)

### Feat

- adding Wrexham and #1046 Horsham councils

### Fix

- Argyll and Bute council #1053

## 0.147.2 (2025-04-18)

### Fix

- wait for element to be clickable

## 0.147.1 (2025-04-18)

### Fix

- #1351 - moved geopandas to poetry dev

## 0.147.0 (2025-04-18)

### Feat

- add council tests results map

## 0.146.2 (2025-04-18)

### Fix

- adding map checking and matching

## 0.146.1 (2025-04-18)

### Fix

- more robust bank holiday handling

## 0.146.0 (2025-04-18)

### Feat

- #1342 Adding multiple councils: Trafford, Clackmannanshire, Havant, North Warwickshire, Newry Mourne and Down, East Dunbartonshire, Pendle, Torfaen, East Hampshire, Ribble Valley, Brentwood, Isle of Wight, Westmorland and Furness, Derry and Strabane, and Norwich
- Google Cal support for PDF councils via ICS file

### Fix

- Black reformatting

## 0.145.0 (2025-04-18)

### Feat

- Adding PDF councils

## 0.144.4 (2025-04-18)

### Fix

- Bristol #1275

## 0.144.3 (2025-04-17)

### Fix

- better address for input.json
- bank holiday overrides
- more robust address searching
- simple parsing done
- Selenium navigation

## 0.144.2 (2025-04-17)

### Fix

- knowsley
- KnowsleyMBCouncil.py
- #1220 adding Mid Ulster District Council

## 0.144.1 (2025-04-17)

### Fix

- fix Sandwell garden waste collection date

## 0.144.0 (2025-04-17)

### Feat

- added great yarmouth

## 0.143.6 (2025-04-17)

### Fix

- Renfrewshire Council

## 0.143.5 (2025-04-17)

### Fix

- Google Cal

## 0.143.4 (2025-04-17)

### Fix

- Google Cal

## 0.143.3 (2025-04-15)

### Fix

- #1301 Fix Leeds Council

## 0.143.2 (2025-04-15)

### Fix

- #1301 Fix Leeds Council

## 0.143.1 (2025-04-15)

### Fix

- Set the bin_type when different day

## 0.143.0 (2025-04-13)

### Fix

- corrected url in input.json
- fixed input.json
- parsed Barking Dagenham collection information
- selenium navigation Barking

## 0.142.0 (2025-04-13)

### Feat

- Added Stirling Council

### Fix

- typo in input.json

## 0.141.4 (2025-04-13)

### Fix

- #1304 - sensors go to unknown if the data is blank from councils who are less reliable

## 0.141.3 (2025-04-13)

### Fix

- Newham council

## 0.141.2 (2025-04-13)

### Fix

- Newham council

## 0.141.1 (2025-04-12)

### Fix

- missing finally block on selenium tests

## 0.141.0 (2025-04-12)

### Feat

- #1185 Adding Peterborough City Council

## 0.140.0 (2025-04-11)

### Feat

- Added Broadland District Council

### Fix

- cleanup of council file
- added Broadland to input.json

## 0.139.0 (2025-04-07)

### Feat

- adding #1037
- adding #1032 North Devon Council

### Fix

- #1296 Forest of Dean
- #939 adding South Holland District Council - Lincolnshire UK

## 0.138.1 (2025-04-05)

### Fix

- Waltham Forest council
- revert previous changes

## 0.138.0 (2025-04-05)

### Feat

- Adding Hastings Borough Council
- Adding Fylde Council

### Fix

- #1249
- #1039 fix: #1181 fix: #1266 fix: #1274
- Gloucester City Council - #1282
- Mid Devon Council - #1277 fix: #1287
- West Oxfordshire Council - #1290

## 0.137.0 (2025-04-05)

### Feat

- #816 adding trafford council

## 0.136.0 (2025-03-24)

### Feat

- Adding Southampton City Council
- Adding Cambridge City Council
- Adding Spelthorne Borough Council

### Fix

- #1057
- #1264
- #1270
- Bexley Council - #1256
- HinckleyandBosworthBoroughCouncil - #1207
- Hackney Council - #1230
- Castle Point District Council - #1252
- Canterbury City Council - #1254

## 0.135.4 (2025-03-24)

### Fix

- parse scheduleCodeWorkflowIDs instead of scheduleCodeWorkflowID for Hackney Council

## 0.135.3 (2025-02-23)

## 0.135.2 (2025-02-19)

### Fix

- North Yorkshire - multiple bins on a day

## 0.135.1 (2025-02-18)

### Fix

- devcontainer

## 0.135.0 (2025-02-17)

### Fix

- #833 adding Middlesbrough and check script for Selenium
- Cotswold District Council - #1238
- Leeds City Council - #1222

## 0.134.3 (2025-02-15)

### Fix

- Update input.json
- #1235 Councils missing Selenium in input.json

## 0.134.2 (2025-02-15)

### Fix

- #1232 East Herts missing Selenium url in input.json
- Derbyshire Dales District Council
- Conwy County Borough
- Sunderland City Council - #1219
- Tendring District Council - #1221

## 0.134.1 (2025-02-11)

### Fix

- Cheltenham Borough Council - #1061

## 0.134.0 (2025-02-07)

### Feat

- Ipswich Borough Council - trying different address
- Ipswich Borough Council - correcting param name in input.json
- Ipswich Borough Council - added input.json values and refactored code
- Ipswich Borough Council - initial implementation
- Adding Runnymede Borough Council
- Adding Cherwell District Council
- Adding Epsom and Ewell Borough Council
- Adding Redcar and Cleveland Council
- Adding Amber Valley Borough Council
- Adding Bolsover Council

### Fix

- #1214
- #923
- #895
- #841
- #903
- #990
- Torridge District Council - #1204
- Neath Port Talbot - #1213

## 0.133.0 (2025-02-02)

### Feat

- adding manual refresh

## 0.132.0 (2025-02-02)

### Feat

- adding manual refresh

## 0.131.0 (2025-02-02)

### Feat

- adding manual refresh
- adding unit tests for the new manual refresh
- adding manual refresh control

## 0.130.1 (2025-01-30)

### Fix

- slow councils

## 0.130.0 (2025-01-29)

### Feat

- Add Herefordshire Council (closes: #1011)

### Fix

- Fix spacing in wiki name

## 0.129.0 (2025-01-29)

### Fix

- input.json

## 0.128.6 (2025-01-29)

### Fix

- moving away from broken Allure reporting

## 0.128.5 (2025-01-29)

### Feat

- Adding East Staffordshire Borough Council

### Fix

- Update behave_pull_request.yml
- Update CheshireEastCouncil.py
- Adding East Lothian Council - #1171
- #1052 fix: #1083

## 0.128.4 (2025-01-28)

### Feat

- Adding Boston Borough Council

### Fix

- Update CheshireEastCouncil.py
- Update behave_schedule.yml
- Update behave_pull_request.yml
- Leicester City Council - #1178
- Cardiff Council - #1175
- Newcastle City Council - #1179 - #1180
- Midlothian Council - #1192
- Adding Next Page support

## 0.128.3 (2025-01-28)

### Fix

- Update CheshireEastCouncil.py
- Update behave_schedule.yml
- Update behave_pull_request.yml

## 0.128.2 (2025-01-28)

### Fix

- Add communal recycling and communal rubbish
- Add garden waste to Merton Council

## 0.128.1 (2025-01-28)

### Fix

- Update AberdeenshireCouncil.py
- Update behave_schedule.yml
- Update behave_pull_request.yml

## 0.128.0 (2025-01-28)

### Feat

- implement Medway Council (#1021)

### Fix

- Forgot to include skip_get_url

## 0.127.4 (2025-01-25)

### Fix

- NewForestCouncil

## 0.127.3 (2025-01-16)

### Fix

- Swale Borough Council - #1139
- Vale of White Horse - #1156
- South Oxfordshire Council - #1158
- Surrey Heath Borough Council - #1164
- Carmarthenshire County Council - #1167
- Glasgow City Council - #1166

## 0.127.2 (2025-01-13)

### Fix

- Update bin type to be the full string

## 0.127.1 (2025-01-10)

### Fix

- Use visibility of list rather than existence
- Update Rushcliffe Borough Council input elements and flow
- Merton Council
- NewarkAndSherwoodDC
- Rushcliffe Borough Council
- Powys Council
- Staffordshire Moorlands District Council
- Stroud District Council
- Vale of Glamorgan Council
- West Oxfordshire District Council

## 0.127.0 (2025-01-07)

### Feat

- Adding Oadby And Wigston Borough Council
- Add Gwynedd Council
- Adding Denbighshire Council
- Adding Dundee City Council
- Adding Brent Council
- Adding West Dunbartonshire Council
- Adding Cumberland Council

### Fix

- #929
- Cornwall Council - #1137
- #1125
- #1106
- #1108
- #1109
- #1134
- Northumberland Council - #1082
- #1110
- Waltham Forest - #1126
- London Borough Sutton - #1131
- Kirklees Council - #1129 - Breaking Change. UPRN required

## 0.126.2 (2025-01-07)

### Fix

- **tests**: updates test case url for coventry city council
- **tests**: removes duplicate key for coventry city council
- updates coventry city council button text

## 0.126.1 (2025-01-06)

### Fix

- behave_testing

## 0.126.0 (2025-01-04)

### Fix

- Update behave_schedule.yml
- Update behave_pull_request.yml

## 0.125.2 (2025-01-04)

### Fix

- Update ArdsAndNorthDownCouncil.py
- Update behave_schedule.yml
- Update behave_pull_request.yml
- Update README.md to have links to Full and Partial Integration Test Reports
- Swale Borough Council - #1080 (cherry picked from commit 6f580b39fb68b8079990221e050ae8dd6d2b7285)
- Update WestLindseyDistrictCouncil.py

## 0.125.1 (2025-01-04)

### Fix

- correctly handle year increment for January dates

## 0.125.0 (2025-01-04)

### Feat

- Adding Redditch Borough Council
- Adding Blaenau Gwent County Borough Council
- Adding Wandsworth Council

### Fix

- #1068
- #1098
- Wiltshire Council - #1094
- Salford City Council - #1097
- #1078
- Merton Council
- Swale Borough Council - #1080
- London Borough Sutton - #1076
- Update behave_schedule.yml
- Update bump.yml

## 0.124.4 (2025-01-04)

### Fix

- Update behave_schedule.yml

## 0.124.3 (2025-01-04)

### Fix

- allure reporting

## 0.124.2 (2025-01-03)

### Fix

- Update behave.yml

## 0.124.1 (2025-01-03)

### Fix

- avoid crashing on unexpected string value

## 0.124.0 (2025-01-02)

### Feat

- Hart District Council

## 0.123.2 (2024-12-19)

### Fix

- Update behave.yml

## 0.123.1 (2024-12-18)

### Feat

- #1063 - rewrite Kirklees Council parser for new website
- #1067 - Add garden bin collections where available for Norwich City Council
- Adding Wandsworth Council

### Fix

- Update AberdeenCityCouncil.py
- Update behave.yml
- #1101 - Fix table parsing for Walsall Council
- Remove invalid escape sequence warnings from West Lindsey District Council
- #1073 - change method of generating bin types to avoid manual mapping for Rugby Borough Council
- add missing backticks to separate colour config and standard usage instructions
- #1078 (cherry picked from commit 89d93666bb659010d1c130b98c1d81c6ff80cf7c)
- change date format to project default for Merton Council
- correct date logic for Swale Borough Council
- Merton Council
- London Borough Sutton - #1076 (cherry picked from commit 1eab20c9a57c9c4438ea343f374202bb2e9b98ca)
- Swale Borough Council - #1080 (cherry picked from commit 6f580b39fb68b8079990221e050ae8dd6d2b7285)
- correct date/year logic for West Lindsey District Council
- replace West Lindsey's input with working address - #1089
- Correct shifted dates in Bromley Borough Council
- remove WDM import
- #1087 - Food waste date incorrect for West Berkshire Council

## 0.123.0 (2024-12-17)

## 0.122.0 (2024-12-04)

### Feat

- Adding Monmouthshire County Council
- Adding Hinckley and Bosworth Borough Council

### Fix

- Glasgow City Council
- Merton Council
- Blaby District Council
- Warwick District Council
- Blackburn Council
- Carmarthenshire County Council
- High Peak Council
- CarmarthenshireCountyCouncil

## 0.121.1 (2024-12-03)

### Fix

- London Borough of Lewisham to have more reliable parsing of dates

## 0.121.0 (2024-11-24)

### Feat

- Royal Borough of Greenwich
- Adding London Borough of Lewisham
- Adding Hackney Council
- Adding Sandwell Borough Council
- Adding Moray Council
- Adding Kings Lynn and West Norfolk Borough Council
- Adding Wyre Forest District Council
- Adding Folkestone and Hythe District Council
- Adding Cheltenham Borough Council
- Adding Thurrock Council

### Fix

- West Northamptonshire Council
- East Ayrshire Council
- Cotswold District Council

## 0.120.0 (2024-11-20)

### Feat

- Adding Hartlepool Borough Council
- Adding Newcastle Under Lyme Council
- Adding London Borough of Havering
- Add Garden collection to EnvironmentFirst
- Adding Cumberland Council (Allerdale District)
- Adding North Hertfordshire District Council

### Fix

- #844
- #778
- #769
- #1025 - Mid Suffolk and Babergh Garden Collection Day
- #1026 - This will require the use of a DAY to be added to the UPRN field
- #1029
- #1028

## 0.119.0 (2024-11-20)

### Feat

- Adding Braintree District Council
- Adding Burnley Borough Council
- Adding Exeter City Council
- Adding Edinburgh City Council

### Fix

- #699
- #1015
- #1017
- #894

## 0.118.0 (2024-11-15)

### Feat

- Adding Aberdeen City Council
- Adding Wolverhampton City Council
- Adding Stevenage Borough Council
- Adding Thanet District Council
- Adding Copeland Borough Council
- Adding South Hams District Council

### Fix

- #1019
- #966
- #989
- #1004
- #1006
- #1008
- Rother District Council

## 0.117.0 (2024-11-13)

### Feat

- Adding South Staffordshire District Council fix: #885
- Adding Rother District Council

### Fix

- #1009

## 0.116.0 (2024-11-12)

### Feat

- Adding Ashfield District Council
- Adding Gravesham Borough Council
- Adding Argyll and Bute Council

### Fix

- CrawleyBoroughCouncil - #1005
- Adding Garden collection to Babergh and MidSuffolk Council - #995
- #579
- #991
- #692
- CheshireWestAndChesterCouncil - #993
- Milton Keynes - #702
- Adding Babergh and Mid Suffolk District Councils - #868 fix: #919
- Adding Derby City Council - #987

## 0.115.0 (2024-11-11)

### Feat

- Adding Warrington Borough Council
- Adding Antrim And Newtonabbey Council
- Adding Hertsmere Borough Council
- Adding West Lancashire Borough Council
- Broxbourne Council

### Fix

- #695
- #969
- #776
- #980
- #982
- Bradford MDC - #984

## 0.114.6 (2024-11-09)

### Fix

- NBBC Date Fix

## 0.114.5 (2024-11-08)

### Fix

- migration logging and debugging

## 0.114.4 (2024-11-08)

### Fix

- migration not working

## 0.114.3 (2024-11-08)

## 0.114.2 (2024-11-08)

## 0.114.1 (2024-11-08)

### Fix

- Update manifest.json

## 0.114.0 (2024-11-07)

### Feat

- Nuneaton and Bedworth Borough Council

## 0.113.0 (2024-11-07)

## 0.112.1 (2024-11-07)

## 0.112.0 (2024-11-06)

### Feat

- adding calendar for Bins in Custom Component

### Fix

- fix manifest in custom component
- #975 adding routine to handle migration error
- #767 BREAKING CHANGE - READD your sensors / config

## 0.111.0 (2024-11-06)

### Fix

- Add London Borough of Sutton - #944
- Add Mid Devon Council - #945
- Adding Oxford City Council - #962
- Tunbridge Wells / Lincoln - #963
- Glasgow City Council

## 0.110.0 (2024-11-04)

### Fix

- Adding Blaby District Council - #904
- Adding Sefton Council - #770
- Adding Bromsgrove District Council - #893
- East Lindsey District Council - #957
- Adding Carmarthenshire County Council - #892 fix: #710
- Adding East Ayrshire Council - #955

## 0.109.2 (2024-11-03)

### Fix

- CC testing and add Chesterfield

## 0.109.1 (2024-11-03)

### Fix

- CC testing and add Chesterfield

## 0.109.0 (2024-11-02)

### Feat

- Adding Cotswold District Council
- Adding Breckland Council

### Fix

- St Helens Borough Council - #753
- NewarkAndSherwoodDC - #941
- #658
- #656

## 0.108.2 (2024-11-01)

### Fix

- pytest-homeassistant-custom-component

## 0.108.1 (2024-11-01)

### Fix

- Pydantic version

## 0.108.0 (2024-11-01)

### Feat

- pytest fixes

## 0.107.0 (2024-10-31)

### Feat

- Adding Powys Council
- Adding Worcester City Council
- Adding Ards and North Down Council
- Adding East Herts Council
- Adding Ashford Borough Council

### Fix

- WestOxfordshireDistrictCouncil
- South Norfolk Council
- ForestOfDeanDistrictCouncil
- Croydon Council
- South Kesteven District Council
- #647
- #630
- #623
- #586
- #578
- #389

## 0.106.0 (2024-10-28)

### Feat

- Adding Stockton On Tees Council
- Adding Fife Council
- Adding Flintshire County Council

### Fix

- #930
- #933
- #750

## 0.105.1 (2024-10-24)

### Fix

- Refactor Midlothian Council scraper to use house number and postcode
- West Berkshire Council
- Southwark Council

## 0.105.0 (2024-10-21)

### Feat

- Adding Teignbridge Council
- Adding Harborough District Council
- Adding Watford Borough Council
- Adding Coventry City Council
- pytest fixes
- Adding Powys Council
- Adding Worcester City Council
- Adding Ards and North Down Council
- Adding East Herts Council
- Adding Ashford Borough Council
- Adding Stockton On Tees Council
- Adding Fife Council
- Adding Flintshire County Council
- Python 3.12 only and CustomComp. Unit testing

### Fix

- #580
- #888
- #902
- #607
- CC testing and add Chesterfield
- pytest-homeassistant-custom-component
- Pydantic version
- WestOxfordshireDistrictCouncil
- South Norfolk Council
- ForestOfDeanDistrictCouncil
- Croydon Council
- South Kesteven District Council
- #647
- #630
- #623
- #586
- #578
- #389
- #930
- #933
- #750
- Refactor Midlothian Council scraper to use house number and postcode
- West Berkshire Council
- Southwark Council

## 0.104.0 (2024-10-20)

### Feat

- Adding Luton Borough Council
- Adding West Oxfordshire District Council
- Adding Aberdeenshire Council
- Adding Canterbury City Council
- Adding Swindon Borough Council

### Fix

- #697
- #694
- #659
- #590
- #900

## 0.103.0 (2024-10-20)

### Feat

- Adding RAW JSON Sensor

### Fix

- Black formatting

## 0.102.0 (2024-10-20)

### Feat

- Moving from Attributes to Sensors

## 0.101.0 (2024-10-20)

### Feat

- Add Midlothian Council

## 0.100.0 (2024-10-18)

### Feat

- Adding Dudley Council
- Adding South Ribble Council
- Plymouth Council
- Adding Norwich City Council

### Fix

- #744
- #671
- #566
- #749

## 0.99.1 (2024-10-16)

### Fix

- #792 adding web_driver option to Wokingham Council

## 0.99.0 (2024-10-16)

### Feat

- Adding Lincoln Council
- Adding Tunbridge Wells Council
- Adding Perth and Kinross Council

### Fix

- Update wiki
- #748
- #598
- #572

## 0.98.5 (2024-10-15)

### Fix

- Swale Borough Council
- HaltonBoroughCouncil
- Barnet Council
- WestBerkshireCouncil

## 0.98.4 (2024-10-14)

### Fix

- West Suffolk Council
- Vale of White Horse Council
- Uttlesford District Council
- Neath Port Talbot Council
- Merton Council
- Manchester City Council
- Glasgow City Council
- BradfordMDC

## 0.98.3 (2024-10-13)

### Fix

- EastRiding

## 0.98.2 (2024-10-13)

### Fix

- MoleValley

## 0.98.1 (2024-10-13)

### Fix

- Barnet and Bexley

## 0.98.0 (2024-10-13)

### Feat

- Adding Wirral Council
- Adding Lichfield District Council
- Adding Westmorland And Furness
- Adding Walsall Council
- Adding Armagh, Banbridge and Craigavon Council

### Fix

- #602
- #830
- #870
- #873
- #877

## 0.97.1 (2024-10-10)

### Fix

- NottinghamCityCouncil - #875

## 0.97.0 (2024-10-10)

### Feat

- Adding Falkirk Council

### Fix

- #761

## 0.96.0 (2024-10-10)

### Feat

- Adding London Borough Harrow
- Adding North Ayrshire Council
- Adding Highland Council
- Add Elmbridge Borough Council
- Adding Southwark Council
- South Derbyshire District Council

### Fix

- #871
- #869
- #780
- #845 fix: #754
- #835
- #842

## 0.95.0 (2024-10-09)

### Feat

- Adding London Borough of Ealing

## 0.94.0 (2024-10-09)

### Feat

- Adding London Borough of Lambeth
- Adding Dacorum Borough Council

### Fix

- Dacorum Borough Council
- East Devon DC

## 0.93.0 (2024-10-08)

### Feat

- Update CheshireEastCouncil.py

## 0.92.0 (2024-10-08)

### Feat

- Update CheshireEastCouncil.py
- Update README.md
- Adding Wokingham Borough Council
- Adding Winchester City Council
- Adding Basildon Council
- Adding Colchester City Council

### Fix

- RochfordCouncil
- Neath Port Talbot Council
- Buckinghamshire Council - #639 fix: #812

## 0.91.2 (2024-10-05)

### Fix

- Windsor and Maidenhead Council

## 0.91.1 (2024-10-04)

## 0.91.0 (2024-10-03)

## 0.90.0 (2024-10-03)

### Feat

- Adding East Renfrewshire Council

### Fix

- Update DorsetCouncil.py - #829
- Update GatesheadCouncil.py - #822

## 0.89.1 (2024-10-02)

### Fix

- High Peak have changed their cookie dialog. Seems to be safe to ignore it now.

## 0.89.0 (2024-09-27)

### Feat

- Update CheshireEastCouncil.py
- Update README.md

### Fix

- release to be non-pre-release

## 0.88.0 (2024-09-16)

### Feat

- Add Ealing Council

### Fix

- Update README.md

## 0.87.0 (2024-09-10)

### Feat

- Add IslingtonCouncil

### Fix

- #565 Gloucester City Council driver

## 0.86.1 (2024-09-09)

### Fix

- #773 Wakefield

## 0.86.0 (2024-09-06)

### Feat

- added Rotherham Council

## 0.85.7 (2024-09-05)

### Fix

- more unit tests
- Chorley

## 0.85.6 (2024-09-03)

### Fix

- #795 and add reconfigure to custom comp.

## 0.85.5 (2024-09-03)

## 0.85.4 (2024-09-03)

### Fix

- #795 and add reconfigure to custom comp.
- #795 Unit Test Coverage

## 0.85.3 (2024-09-02)

### Fix

- #795 unit test coverage

## 0.85.2 (2024-09-02)

### Fix

- #791 Glasgow URL change

## 0.85.1 (2024-09-02)

### Fix

- #779 Add correct async wait to Home Assistant

## 0.85.0 (2024-08-27)

### Feat

- support for enfield council

## 0.84.2 (2024-08-27)

### Fix

- Re-work North Tyneside Council module for 2024
- some addresses do not have a garden collection

## 0.84.1 (2024-08-08)

### Fix

- #771 Bolton bullet points on dates is now fixed

## 0.84.0 (2024-07-31)

## 0.83.0 (2024-07-07)

### Feat

- add has_numbers() function

### Fix

- update Gedling Borough Council parser to use alternative name key
- change Gedling to use new JSON data
- update instructions for Gedling
- update input.json to use UPRN parameter
- change DorsetCouncil.py to use API links provided in #756
- explicit import of logging.config to stop error in Python 3.11

## 0.82.0 (2024-06-13)

### Feat

- adding dev container updates
- refactoring main files
- adding ability to set local mode in HA custom comp. if users don't have a Selenium Server

### Fix

- MidSussex

## 0.81.0 (2024-06-05)

### Feat

- Adding Wychavon District Council

### Fix

- IntTestWarnings

## 0.80.0 (2024-06-02)

### Feat

- Adding Uttlesford District Council
- Adding Stafford Boro Council
- Adding Swansea Council
- Adding New Forest
- Adding Three Rivers

### Fix

- ThreeRivers
- #425 Entities are not updated
- sessions to avoid deprecation
- Update docker-image.yml

## 0.79.1 (2024-05-29)

### Fix

- Change CSS class in search for collection types

## 0.79.0 (2024-05-28)

### Feat

- Adding Dartford
- Adding South Kesteven District Council
- Adding ChichesterCouncil
- adding HounslowCouncil
- Epping Fix - Adding Epping Forest District Council
- Update input.json - Epping Forest District Council
- Adding Stroud District Council
- Add support for Tendring District Council
- #269 Adding Waltham Forest
- Adding council creation script

### Fix

- Update Mole Valley URL

## 0.78.0 (2024-05-26)

### Feat

- Add support for Fareham Borough Council

## 0.77.0 (2024-05-26)

### Feat

- Add support for Bracknell Forest Council

## 0.76.1 (2024-05-24)

### Fix

- Handle Barnet council cookies message

## 0.76.0 (2024-05-24)

### Feat

- add bin colour support WestSuffolkCouncil style: black format WestSuffolkCouncil

## 0.75.0 (2024-05-19)

### Feat

- #725 Add names to selenium test videos using "se:name" option in create webdriver function

## 0.74.1 (2024-05-18)

### Fix

- #693 Cheshire West & Chester Council Sensor Bug

## 0.74.0 (2024-05-17)

### Feat

- #722 Support Python 3.12

## 0.73.0 (2024-05-17)

### Feat

- #708 Adding HA to the dev container for debugging

## 0.72.0 (2024-05-17)

### Feat

- #708 Adding HA to
the dev container for debugging - #708 Adding HA to the dev container for debugging - #708 Adding HA to the dev container for debugging - #708 Adding HA to the dev container for debugging - #708 Adding HA to the dev container for debugging - #708 Adding HA to the dev container for debugging - #708 Adding HA to the dev container for debugging - #708 Adding HA to the dev container for debugging ## 0.71.0 (2024-05-17) ### Feat - Update for West Suffolk Councils new website ## 0.70.0 (2024-05-17) ### Feat - #708 Dev Container - Dev Container - #708 Dev Container - #708 Dev Container - #708 Dev Container - #708 Dev Container - #708 Dev Container - #708 Dev Container - #708 Dev Container - #708 Dev Container - #708 Dev Container - #708 Dev Container - #708 simplifying Selenium integration tests - #708 simplifying Selenium integration tests - #708 Test GH action seenium - #708 Test GH action seenium - #708 Test GH action seenium - #708 Test GH action seenium - #708 Test GH action seenium - #708 Test GH action seenium - #708 Test GH action seenium - #708 Test GH action seenium - #708 Test GH action seenium - #708 Test GH action seenium - #708 Dev Container testing - #708 - dev container changes - #706 Adding Dev Container - #706 Adding initial Dev Container ## 0.69.7 (2024-05-17) ### Fix - #713 BarnsleyMBCouncil.py ## 0.69.6 (2024-05-16) ### Fix - #709 Update DoverDistrictCouncil.py ## 0.69.5 (2024-05-14) ### Fix - #696 Small issue and Black formatting - #696 Small issue and Black formatting - #696 Small issue and Black formatting - #696 Small issue and Black formatting - #696 Small issue and Black formatting - #696 Small issue and Black formatting - #696 Small issue and Black formatting - #696 Small issue and Black formatting - #696 test coverage back to 100% ## 0.69.4 (2024-05-09) ### Fix - pass in required parameter into `create_webdriver` - test runners for `MiltonKeynesCityCouncil` and `NorthEastLincs`. 
## 0.69.3 (2024-05-09)

### Fix

- fix AttributeError when no garden waste collection is available for properties using Huntingdon District Council
- add support for parsing "Today" / "Tomorrow" as date text for `BarnsleyMBCouncil`
- add support for parsing "Tomorrow" as date text for `LiverpoolCityCouncil`

## 0.69.1 (2024-05-01)

### Fix

- Handling the "Website cookies enhance your user experience." button

## 0.69.0 (2024-04-28)

### Feat

- Adding Renfrewshire Council

## 0.68.2 (2024-04-28)

### Fix

- Remove 'import Dumper'

## 0.68.1 (2024-04-27)

### Fix

- input.json Bradford missing comma

## 0.68.0 (2024-04-27)

### Feat

- Add support for West Berkshire Council
- add support for Knowsley Metropolitan Borough Council
- add support for Cheshire West and Chester Council

## 0.66.2 (2024-04-18)

### Fix

- Update HaringeyCouncil.py issue #670

## 0.66.1 (2024-04-15)

### Fix

- parse datetimes correctly and round to midnight

## 0.66.0 (2024-04-15)

## 0.65.2 (2024-04-15)

### Fix

- change address selection to fix errors selecting the user's PAON

## 0.65.1 (2024-04-15)

### Fix

- add check for parsed string length to stop datetime parsing error

## 0.65.0 (2024-04-13)

### Feat

- add Arun council
- add support for Sunderland City Council

## 0.64.3 (2024-03-25)

### Fix

- sort data and correct dictionary name (#609)

## 0.64.2 (2024-03-24)

## 0.64.1 (2024-03-24)

### Fix

- fix Kirklees address search (switch to house & postcode)
- fixes json

## 0.64.0 (2024-03-23)

### Feat

- add Kirklees council

### Fix

- fixes json

## 0.63.0 (2024-03-23)

### Feat

- Add Solihull Council (#513)
- Add Adur and Worthing Councils (#544)
- Add Dover District Council (#614)
- Add Rochford Council (#620)
- Add Tandridge District Council (#621)
- Add West Northamptonshire Council (#567)
- Add Hull City Council (#622)
- Add Wyre Council (#625)
- Add Telford and Wrekin Co-operative Council (#632)
- Add Mansfield District Council (#560)
- Add Bedford Borough Council (#552)

### Fix

- spacing on input.json
- realign input.json
- capitalize bin type text
- formatting on input.json
- incorrect collections
- update testing URL for Merton
- attempt to resolve invisible banner hiding postcode box
- resolve JSON schema exception for date formatting
- accept cookies banner

## 0.62.0 (2024-03-03)

### Fix

- Added missing .feature file entry to the test config for NewhamCouncil

## 0.61.1 (2024-02-16)

### Fix

- code optimisations
- Fix date parsing in WestLindseyDistrictCouncil.py

## 0.61.0 (2024-02-11)

### Feat

- Add Mole Valley District Council

## 0.60.1 (2024-02-03)

### Fix

- Update input.json. Closes #599

## 0.60.0 (2024-01-28)

### Feat

- Add Scraper for St Albans City and District Council

## 0.59.1 (2024-01-25)

### Fix

- add wiki note for castlepoint
- update test data for castlepoint
- remove single line causing issues

## 0.59.0 (2024-01-20)

### Feat

- Add NorthYorkshire to test feature file
- Add north yorkshire to test input
- Add Support for north yorkshire council

### Fix

- remove unused code

## 0.58.8 (2024-01-19)

### Fix

- barnet no overrides

## 0.58.7 (2024-01-18)

### Fix

- accidentally returned strings where date objects were needed; refactor to handle this
- checking for future/past dates

## 0.58.6 (2024-01-18)

### Fix

- correct date handling for North West Leicestershire

## 0.58.5 (2024-01-15)

### Fix

- Don't call driver.quit where already handled by finally block

## 0.58.4 (2024-01-15)

### Fix

- remove extra driver.quit to prevent errors

## 0.58.3 (2024-01-15)

### Feat

- Added support for Newham Council's bin collections

### Fix

- Add a default value for user_agent to fix all councils using Selenium and not specifying an agent

## 0.58.2 (2024-01-11)

### Fix

- use static values for bin types

## 0.58.1 (2024-01-10)

### Fix

- Eastleigh Borough Council doesn't cope when the Garden Waste service hasn't been signed up for, which returns the value "You haven't yet signed up for our garden waste collections. Find out more about our garden waste collection service" and results in a ValueError on the time data

## 0.58.0 (2024-01-10)

### Feat

- Add Test Valley Borough Council

## 0.57.0 (2024-01-09)

### Feat

- Add support for Chorley Council

## 0.56.13 (2024-01-09)

### Fix

- update logic to account for council website change

## 0.56.12 (2024-01-09)

### Fix

- duplicate driver.quit() calls cause an error

## 0.56.11 (2024-01-08)

### Fix

- Headless now working on custom comp; Update sensor.py

## 0.56.10 (2024-01-08)

### Fix

- headless mode in custom component

## 0.56.9 (2024-01-08)

### Fix

- headless mode

## 0.56.8 (2024-01-08)

### Fix

- headless in custom comp

## 0.56.7 (2024-01-08)

### Fix

- headless options

## 0.56.6 (2024-01-07)

### Fix

- modified Kingston-upon-Thames driver for greater reliability.
## 0.56.5 (2024-01-07)

### Fix

- Update KingstonUponThamesCouncil.py

## 0.56.4 (2024-01-07)

### Fix

- Update KingstonUponThamesCouncil.py

## 0.56.3 (2024-01-07)

### Fix

- headless options
- #542 - Selenium Grid sessions must be terminated cleanly

## 0.56.2 (2024-01-07)

### Fix

- Update strings.json
- Update en.json
- Update config_flow.py

## 0.56.1 (2024-01-07)

### Fix

- Update common.py

## 0.156.0 (2025-10-11)

### Feat

- Create tag-on-merge.yml
- Update bump.yml
- fix bump.yml
- Update TorbayCouncil.py
- fix release pipeline bump.yml
- fix Torbay

### Fix

- Update AberdeenCityCouncil.py
- Update TorbayCouncil.py
- Update URL for NewForestCouncil
- New URL and page for wheelie bins
- improve Mid Suffolk District Council holiday handling with dynamic bank holiday detection
- Oxford now rejects the "Requests" default user agent
- #1557 - Adding East Dunbartonshire
- #1569 - Somerset Council
- #1559 - Newport City Council
- #1574 - Test Valley Borough Council
- #1566 South Gloucestershire Council

## 0.154.0 (2025-09-21)

### Feat

- handle changes to Northumberland council website
- modify input for NorthumberlandCouncil to accept UPRN instead of house number, and use new page structure

### Fix

- the cookie banner is not optional
- #1570 - Slough Borough Council
- #1520 - Erewash Borough Council
- #1554 - Folkestone and Hythe District Council
- #1604 - West Berkshire Council
- #1606 - Brighton and Hove City Council
- #1565 - BCP Council
- #1571 - Castle Point District Council
- #1584 - NorthHertfordshireDistrictCouncil
- #1599 - Basingstoke Council
- #1587 - Hartlepool Borough Council
- #1588 - Glasgow City Council
- #1591 - Rushmoor Council

## 0.153.0 (2025-09-02)

### Feat

- Change Buckinghamshire council to get data from endpoint

### Fix

- #1573 Update Bolton council URL
- East Herts Council - #1575
- Runnymede Borough Council - #1513
- Wiltshire Council - #1533
- Staffordshire Moorlands District Council - #1535
- Ipswich Borough Council - #1548
- North East Lincs
- Hinckley and Bosworth Borough Council
- Nuneaton Bedworth Borough Council - #1514
- Lichfield District Council - #1549

## 0.152.11 (2025-08-25)

### Feat

- fix releases process

### Fix

- date extraction in RochfordCouncil data parsing
- parsing error in BH selenium
- **hacs**: respect the headless option

### Refactor

- **hacs**: improve build_ukbcd_args with formatter functions

## 0.152.10 (2025-08-04)

### Fix

- Gateshead and East Lothian
- Enfield and Broxbourne
- East Herts
- FermanaghOmaghDistrictCouncil

## 0.152.9 (2025-08-03)

### Fix

- Cotswold and Coventry
- Fixing multiple broken councils

## 0.152.8 (2025-07-26)

### Fix

- Add headers to request for Swindon Borough Council
- Add headers to requests for Royal Borough of Greenwich. Fixes #1496 by ensuring that the requests are not rejected due to lack of headers.
- **MidlothianCouncil**: add request headers to resolve 403 Forbidden

## 0.152.7 (2025-07-01)

### Fix

- Maidstone Selenium fix

## 0.152.6 (2025-06-18)

### Fix

- removed "In Progress" from date
- removed a debug print statement
- **RugbyBoroughCouncil**: Amended parsed date from full to abbreviated month; May worked but Jun and Jul did not
- Reworked Cumberland Council to cater for postcode addition
- **OxfordCityCouncil**: Fixed Oxford City Council parsing due to changes in output from the website

## 0.152.5 (2025-06-07)

### Fix

- South Ribble and version pinning issues for input.json

## 0.152.4 (2025-06-07)

### Fix

- **SouthRibble**: Corrected date formatting issue
- **SouthRibble**: Resolved South Ribble without Selenium

## 0.152.3 (2025-06-04)

### Fix

- NorthHertfordshire Selenium script
- Adur council
- Eastleigh date fix
- removed duplicates in BradfordMDC

## 0.152.2 (2025-06-04)

### Fix

- Update Makefile
- Update CheshireEastCouncil.py
- GitHub action to handle branch name with parentheses

## 0.152.1 (2025-05-15)

### Fix

- Update to fix North Somerset
- Glasgow SSL bypass
- more robust Northumberland
- updated Eastleigh input.json
- Eastleigh Cloudflare fix
- converted collection datetimes into dates for BH parsing
- added check_uprn to simplified councils
- simplified Swindon
- simplified East Devon
- simplified Dover
- Simplified Dartford
- simplified Cheshire East
- simplified Charnwood input.json
- improved Charnwood
- Adur Worthing fix
- Chorley simplification
- Bexley simplification
- added URL to Torbay script
- Guildford fixes
- reworked Maidstone
- Maidstone input.json
- Croydon Selenium version
- Stoke date-time fix

## 0.152.0 (2025-05-02)

### Feat

- Added Fermanagh Omagh
- Added Tewkesbury
- added Slough council
- Added Angus Council
- added Angus to input.json

### Fix

- Chichester now only requires postcode and house number
- Broadland now only requires postcode and house number
- Barking now only requires postcode and house number
- Brighton now only requires postcode and house number
- ensured all bins for this council
- added skip_get_url to Hyndburn

## 0.151.0 (2025-04-27)

### Feat

- version bump

### Fix

- more robust Brent date handling
- input.json requires web_driver
- Rugby fix

## 0.150.0 (2025-04-27)

### Feat

- added Melton
- added Pembrokeshire

### Fix

- added Melton
- processed all bins for Moray

## 0.148.6 (2025-04-27)

## 0.148.5 (2025-04-27)

### Fix

- output check
- parsed bin info
- Selenium navigation
- input.json changes

## 0.148.4 (2025-04-27)

### Fix

- used canonical 'nice name'

## 0.148.3 (2025-04-25)

### Fix

- working Hyndburn
- Hyndburn input.json

## 0.148.2 (2025-04-24)

### Fix

- Update docker-compose.yml
- updated input.json
- Cloudflare fix
- switch to Selenium method
- simplified Blackburn

## 0.148.1 (2025-04-22)

### Fix

- added bank holiday offsets
## 0.148.0 (2025-04-19)

### Feat

- adding Wrexham and #1046 Horsham councils

### Fix

- Argyll and Bute council #1053

## 0.147.2 (2025-04-18)

### Fix

- wait for element to be clickable

## 0.147.1 (2025-04-18)

### Fix

- #1351 - moved geopandas to poetry dev

## 0.147.0 (2025-04-18)

### Feat

- add council test results map

## 0.146.2 (2025-04-18)

### Fix

- adding map checking and matching

## 0.146.1 (2025-04-18)

### Fix

- more robust bank holiday handling

## 0.146.0 (2025-04-18)

### Feat

- #1342 Adding Trafford, Clackmannanshire, Havant, North Warwickshire, Newry Mourne and Down, East Dunbartonshire, Pendle, Torfaen, East Hampshire, Ribble Valley, Brentwood, Isle of Wight, Westmorland and Furness, Derry and Strabane, and Norwich
- Google Cal support for PDF councils via ICS file

### Fix

- Black reformatting

## 0.145.0 (2025-04-18)

### Feat

- Adding PDF councils

## 0.144.4 (2025-04-18)

### Fix

- Bristol #1275

## 0.144.3 (2025-04-17)

### Fix

- better address for input.json
- bank holiday overrides
- more robust address searching
- simple parsing done
- Selenium navigation

## 0.144.2 (2025-04-17)

### Fix

- Knowsley
- KnowsleyMBCouncil.py
- #1220 adding Mid Ulster District Council

## 0.144.1 (2025-04-17)

### Fix

- fix Sandwell garden waste collection date

## 0.144.0 (2025-04-17)

### Feat

- added Great Yarmouth

## 0.143.6 (2025-04-17)

### Fix

- Renfrewshire Council

## 0.143.5 (2025-04-17)

### Fix

- Google Cal

## 0.143.4 (2025-04-17)

### Fix

- Google Cal

## 0.143.3 (2025-04-15)

### Fix

- #1301 Fix Leeds Council

## 0.143.2 (2025-04-15)

### Fix

- #1301 Fix Leeds Council

## 0.143.1 (2025-04-15)

### Fix

- Set the bin_type when the day is different

## 0.143.0 (2025-04-13)

### Fix

- corrected URL in input.json
- fixed input.json
- parsed Barking Dagenham collection information
- Selenium navigation for Barking

## 0.142.0 (2025-04-13)

### Feat

- Added Stirling Council

### Fix

- typo in input.json

## 0.141.4 (2025-04-13)

### Fix

- #1304 - sensors go to unknown if the data is blank from councils that are less reliable

## 0.141.3 (2025-04-13)

### Fix

- Newham council

## 0.141.2 (2025-04-13)

### Fix

- Newham council

## 0.141.1 (2025-04-12)

### Fix

- missing finally block on Selenium tests

## 0.141.0 (2025-04-12)

### Feat

- #1185 Adding Peterborough City Council

## 0.140.0 (2025-04-11)

### Feat

- Added Broadland District Council

### Fix

- cleanup of council file
- added Broadland to input.json

## 0.139.0 (2025-04-07)

### Feat

- adding #1037
- adding #1032 North Devon County Council

### Fix

- #1296 Forest of Dean
- #939 adding South Holland District Council - Lincolnshire UK

## 0.138.1 (2025-04-05)

### Fix

- Waltham Forest council
- revert previous changes

## 0.138.0 (2025-04-05)

### Feat

- Adding Hastings Borough Council
- Adding Fylde Council

### Fix

- #1249
- #1039 fix: #1181 fix: #1266 fix: #1274
- Gloucester City Council - #1282
- Mid Devon Council - #1277 fix: #1287
- West Oxfordshire Council - #1290

## 0.137.0 (2025-04-05)

### Feat

- #816 adding Trafford council

## 0.136.0 (2025-03-24)

### Feat

- Adding Southampton City Council
- Adding Cambridge City Council
- Adding Spelthorne Borough Council

### Fix

- #1057
- #1264
- #1270
- Bexley Council - #1256
- HinckleyandBosworthBoroughCouncil - #1207
- Hackney Council - #1230
- Castlepoint District Council - #1252
- Canterbury City Council - #1254

## 0.135.4 (2025-03-24)

### Fix

- parse scheduleCodeWorkflowIDs instead of scheduleCodeWorkflowID for Hackney Council

## 0.135.3 (2025-02-23)

## 0.135.2 (2025-02-19)

### Fix

- North Yorkshire - multiple bins on a day

## 0.135.1 (2025-02-18)

### Fix

- devcontainer

## 0.135.0 (2025-02-17)

### Fix

- #833 adding Middlesbrough and check script for Selenium
- Cotswold District Council - #1238
- Leeds City Council - #1222

## 0.134.3 (2025-02-15)

### Fix

- Update input.json
- #1235 Councils missing Selenium in input.json

## 0.134.2 (2025-02-15)

### Fix

- #1232 East Herts missing Selenium URL in input.json
- Derbyshire Dales District Council
- Conwy County Borough
- Sunderland City Council - #1219
- Tendring District Council - #1221

## 0.134.1 (2025-02-11)

### Fix

- Cheltenham Borough Council - #1061

## 0.134.0 (2025-02-07)

### Feat

- Ipswich Borough Council - trying different address
- Ipswich Borough Council - correcting param name in input.json
- Ipswich Borough Council - added input.json values and refactored code
- Ipswich Borough Council - initial implementation
- Adding Runnymede Borough Council
- Adding Cherwell District Council
- Adding Epsom and Ewell Borough Council
- Adding Redcar and Cleveland Council
- Adding Amber Valley Borough Council
- Adding Bolsover Council

### Fix

- #1214
- #923
- #895
- #841
- #903
- #990
- Torridge District Council - #1204
- Neath Port Talbot - #1213

## 0.133.0 (2025-02-02)

### Feat

- adding manual refresh

## 0.132.0 (2025-02-02)

### Feat

- adding manual refresh

## 0.131.0 (2025-02-02)

### Feat

- adding manual refresh
- adding unit tests for the new manual refresh
- adding manual refresh control

## 0.130.1 (2025-01-30)

### Fix

- slow councils

## 0.130.0 (2025-01-29)

### Feat

- Add Herefordshire Council (closes: #1011)

### Fix

- Fix spacing in wiki name

## 0.129.0 (2025-01-29)

### Fix

- input.json

## 0.128.6 (2025-01-29)

### Fix

- moving away from broken Allure reporting

## 0.128.5 (2025-01-29)

### Feat

- Adding East Staffordshire Borough Council

### Fix

- Update behave_pull_request.yml
- Update CheshireEastCouncil.py
- Adding East Lothian Council - #1171
- #1052 fix: #1083

## 0.128.4 (2025-01-28)

### Feat

- Adding Boston Borough Council

### Fix

- Update CheshireEastCouncil.py
- Update behave_schedule.yml
- Update behave_pull_request.yml
- Leicester City Council - #1178
- Cardiff Council - #1175
- Newcastle City Council - #1179 - #1180
- Midlothian Council - #1192 - Adding Next Page support

## 0.128.3 (2025-01-28)

### Fix

- Update CheshireEastCouncil.py
- Update behave_schedule.yml
- Update behave_pull_request.yml

## 0.128.2 (2025-01-28)

### Fix

- Add communal recycling and communal rubbish
- Add garden waste to Merton Council

## 0.128.1 (2025-01-28)

### Fix

- Update AberdeenshireCouncil.py
- Update behave_schedule.yml
- Update behave_pull_request.yml

## 0.128.0 (2025-01-28)

### Feat

- implement Medway Council (#1021)

### Fix

- Forgot to include skip_get_url

## 0.127.4 (2025-01-25)

### Fix

- NewForestCouncil

## 0.127.3 (2025-01-16)

### Fix

- Swale Borough Council - #1139
- Vale of White Horse - #1156
- South Oxfordshire Council - #1158
- Surrey Heath Borough Council - #1164
- Carmarthenshire County Council - #1167
- Glasgow City Council - #1166

## 0.127.2 (2025-01-13)

### Fix

- Update bin type to be the full string

## 0.127.1 (2025-01-10)

### Fix

- Use visibility of list rather than existence
- Update Rushcliffe Borough Council input elements and flow
- Merton Council
- NewarkAndSherwoodDC
- Rushcliffe Borough Council
- Powys Council
- Staffordshire Moorlands District Council
- Stroud District Council
- Vale of Glamorgan Council
- West Oxfordshire District Council

## 0.127.0 (2025-01-07)

### Feat

- Adding Oadby And Wigston Borough Council
- Add Gwynedd Council
- Adding Denbighshire Council
- Adding Dundee City Council
- Adding Brent Council
- Adding West Dunbartonshire Council
- Adding Cumberland Council

### Fix

- #929
- Cornwall Council - #1137
- #1125
- #1106
- #1108
- #1109
- #1134
- Northumberland Council - #1082
- #1110
- Waltham Forest - #1126
- London Borough Sutton - #1131
- Kirklees Council - #1129 - Breaking Change. UPRN required

## 0.126.2 (2025-01-07)

### Fix

- **tests**: updates test case url for coventry city council
- **tests**: removes duplicate key for coventry city council
- updates coventry city council button text

## 0.126.1 (2025-01-06)

### Fix

- behave_testing

## 0.126.0 (2025-01-04)

### Fix

- Update behave_schedule.yml
- Update behave_pull_request.yml

## 0.125.2 (2025-01-04)

### Fix

- Update ArdsAndNorthDownCouncil.py
- Update behave_schedule.yml
- Update behave_pull_request.yml
- Update README.md to have links to Full and Partial Integration Test Reports
- Swale Borough Council - #1080 (cherry picked from commit 6f580b39fb68b8079990221e050ae8dd6d2b7285)
- Update behave_schedule.yml
- Update behave_pull_request.yml
- Update ArdsAndNorthDownCouncil.py
- Update README.md to have links to Full and Partial Integration Test Reports
- Update WestLindseyDistrictCouncil.py

## 0.125.1 (2025-01-04)

### Fix

- correctly handle year increment for January dates

## 0.125.0 (2025-01-04)

### Feat

- Adding Redditch Borough Council
- Adding Blaenau Gwent County Borough Council
- Adding Wandsworth Council

### Fix

- #1068
- #1098
- Wiltshire Council - #1094
- Salford City Council - #1097
- #1078
- Merton Council
- Swale Borough Council - #1080
- London Borough Sutton - #1076
- Update behave_schedule.yml
- Update bump.yml

## 0.124.4 (2025-01-04)

### Fix

- Update behave_schedule.yml

## 0.124.3 (2025-01-04)

### Fix

- allure reporting

## 0.124.2 (2025-01-03)

### Fix

- Update behave.yml

## 0.124.1 (2025-01-03)

### Fix

- avoid crashing on unexpected string value

## 0.124.0 (2025-01-02)

### Feat

- Hart District Council

## 0.123.2 (2024-12-19)

### Fix

- Update behave.yml

## 0.123.1 (2024-12-18)

### Feat

- #1063 - rewrite Kirklees Council parser for new website
- #1067 - Add garden bin collections where available for Norwich City Council
- Adding Wandsworth Council

### Fix

- Update AberdeenCityCouncil.py
- Update behave.yml
- #1101 - Fix table parsing for Walsall Council
- Remove invalid escape sequence warnings from West Lindsey District Council
- #1073 - change method of generating bin types to avoid manual mapping for Rugby Borough Council
- add missing backticks to separate colour config and standard usage instructions
- #1078 (cherry picked from commit 89d93666bb659010d1c130b98c1d81c6ff80cf7c)
- change date format to project default for Merton Council
- correct date logic for Swale Borough Council
- Merton Council
- London Borough Sutton - #1076 (cherry picked from commit 1eab20c9a57c9c4438ea343f374202bb2e9b98ca)
- Swale Borough Council - #1080 (cherry picked from commit 6f580b39fb68b8079990221e050ae8dd6d2b7285)
- correct date/year logic for West Lindsey District Council
- replace West Lindsey's input with working address
- #1089 - Correct shifted dates in Bromley Borough Council
- remove WDM import
- #1087 - Food waste date incorrect for West Berkshire Council

## 0.123.0 (2024-12-17)

## 0.122.0 (2024-12-04)

### Feat

- Adding Monmouthshire County Council
- Adding Hinckley and Bosworth Borough Council

### Fix

- Glasgow City Council
- Merton Council
- Blaby District Council
- Warwick District Council
- Blackburn Council
- Carmarthenshire County Council
- High Peak Council
- CarmarthenshireCountyCouncil

## 0.121.1 (2024-12-03)

### Fix

- London Borough of Lewisham to have more reliable parsing of dates

## 0.121.0 (2024-11-24)

### Feat

- Royal Borough of Greenwich
- Adding London Borough of Lewisham
- Adding Hackney Council
- Adding Sandwell Borough Council
- Adding Moray Council
- Adding Kings Lynn and West Norfolk Borough Council
- Adding Wyre Forest District Council
- Adding Folkestone and Hythe District Council
- Adding Cheltenham Borough Council
- Adding Thurrock Council

### Fix

- West Northamptonshire Council
- East Ayrshire Council
- Cotswold District Council

## 0.120.0 (2024-11-20)

### Feat

- Adding Hartlepool Borough Council
- Adding Newcastle Under Lyme Council
- Adding London Borough of Havering
- Add Garden collection to EnvironmentFirst
- Adding Cumberland Council (Allerdale District)
- Adding North Hertfordshire District Council

### Fix

- #844
- #778
- #769
- #1025 - Mid Suffolk and Babergh Garden Collection Day
- #1026 This will require the use of a DAY to be added to the UPRN field
- #1029
- #1028

## 0.119.0 (2024-11-20)

### Feat

- Adding Braintree District Council
- Adding Burnley Borough Council
- Adding Exeter City Council
- Adding Edinburgh City Council

### Fix

- #699
- #1015
- #1017
- #894

## 0.118.0 (2024-11-15)

### Feat

- Adding Aberdeen City Council
- Adding Wolverhampton City Council
- Adding Stevenage Borough Council
- Adding Thanet District Council
- Adding Copeland Borough Council
- Adding South Hams District Council

### Fix

- #1019
- #966
- #989
- #1004
- #1006
- #1008
- Rother District Council

## 0.117.0 (2024-11-13)

### Feat

- Adding South Staffordshire District Council fix: #885
- Adding Rother District Council

### Fix

- #1009

## 0.116.0 (2024-11-12)

### Feat

- Adding Ashfield District Council
- Adding Gravesham Borough Council
- Adding Argyll and Bute Council

### Fix

- CrawleyBoroughCouncil - #1005
- Adding Garden collection to Babergh and MidSuffolk Council - #995
- #579
- #991
- #692
- CheshireWestAndChesterCouncil - #993
- Milton Keynes - #702
- Adding Babergh and Mid Suffolk District Councils - #868 fix: #919
- Adding Derby City Council - #987

## 0.115.0 (2024-11-11)

### Feat

- Adding Warrington Borough Council
- Adding Antrim And Newtonabbey Council
- Adding Hertsmere Borough Council
- Adding West Lancashire Borough Council
- Broxbourne Council

### Fix

- #695
- #969
- #776
- #980
- #982
- Bradford MDC - #984

## 0.114.6 (2024-11-09)

### Fix

- NBBC Date Fix

## 0.114.5 (2024-11-08)

### Fix

- migration logging and debugging

## 0.114.4 (2024-11-08)

### Fix

- migration not working

## 0.114.3 (2024-11-08)

## 0.114.2 (2024-11-08)

## 0.114.1 (2024-11-08)

### Fix

- Update manifest.json

## 0.114.0 (2024-11-07)

### Feat

- Nuneaton and Bedworth Borough Council

## 0.113.0 (2024-11-07)

## 0.112.1 (2024-11-07)

## 0.112.0 (2024-11-06)

### Feat

- adding calendar for Bins in Custom Component

### Fix

- fix manifest in custom component
- #975 adding routine to handle migration error
- #767 BREAKING CHANGE - RE-ADD your sensors / config

## 0.111.0 (2024-11-06)

### Fix

- Add London Borough of Sutton - #944
- Add Mid Devon Council - #945
- Adding Oxford City Council - #962
- Tunbridge Wells / Lincoln - #963
- Glasgow City Council

## 0.110.0 (2024-11-04)

### Fix

- Adding Blaby District Council - #904
- Adding Sefton Council - #770
- Adding Bromsgrove District Council - #893
- East Lindsey District Council - #957
- Adding Carmarthenshire County Council - #892 fix: #710
- Adding East Ayrshire Council - #955

## 0.109.2 (2024-11-03)

### Fix

- CC testing and add Chesterfield

## 0.109.1 (2024-11-03)

### Fix

- CC testing and add Chesterfield

## 0.109.0 (2024-11-02)

### Feat

- Adding Cotswold District Council
- Adding Breckland Council

### Fix

- St Helens Borough Council - #753
- NewarkAndSherwoodDC - #941
- #658
- #656

## 0.108.2 (2024-11-01)

### Fix

- pytest-homeassistant-custom-component

## 0.108.1 (2024-11-01)

### Fix

- Pydantic version

## 0.108.0 (2024-11-01)

### Feat

- pytest fixes

## 0.107.0 (2024-10-31)

### Feat

- Adding Powys Council
- Adding Worcester City Council
- Adding Ards and North Down Council
- Adding East Herts Council
- Adding Ashford Borough Council

### Fix

- WestOxfordshireDistrictCouncil
- South Norfolk Council
- ForestOfDeanDistrictCouncil
- Croydon Council
- South Kesteven District Council
- #647
- #630
- #623
- #586
- #578
- #389

## 0.106.0 (2024-10-28)

### Feat

- Adding Stockton On Tees Council
- Adding Fife Council
- Adding Flintshire County Council

### Fix

- #930
- #933
- #750

## 0.105.1 (2024-10-24)

### Fix

- Refactor Midlothian Council scraper to use house number and postcode
- West Berkshire Council
- Southwark Council

## 0.105.0 (2024-10-21)

### Feat

- Adding Teignbridge Council
- Adding Harborough District Council
- Adding Watford Borough Council
- Adding Coventry City Council
- pytest fixes
- Adding Powys Council
- Adding Worcester City Council
- Adding Ards and North Down Council
- Adding East Herts Council
- Adding Ashford Borough Council
- Adding Stockton On Tees Council
- Adding Fife Council
- Adding Flintshire County Council
- Python 3.12 only and CustomComp. unit testing

### Fix

- #580
- #888
- #902
- #607
- CC testing and add Chesterfield
- pytest-homeassistant-custom-component
- Pydantic version
- WestOxfordshireDistrictCouncil
- South Norfolk Council
- ForestOfDeanDistrictCouncil
- Croydon Council
- South Kesteven District Council
- #647
- #630
- #623
- #586
- #578
- #389
- #930
- #933
- #750
- Refactor Midlothian Council scraper to use house number and postcode
- West Berkshire Council
- Southwark Council

## 0.104.0 (2024-10-20)

### Feat

- Adding Luton Borough Council
- Adding West Oxfordshire District Council
- Adding Aberdeenshire Council
- Adding Canterbury City Council
- Adding Swindon Borough Council

### Fix

- #697
- #694
- #659
- #590
- #900

## 0.103.0 (2024-10-20)

### Feat

- Adding RAW JSON Sensor

### Fix

- Black formatting

## 0.102.0 (2024-10-20)

### Feat

- Moving from Attributes to Sensors

## 0.101.0 (2024-10-20)

### Feat

- Add Midlothian Council

## 0.100.0 (2024-10-18)

### Feat

- Adding Dudley Council
- Adding South Ribble Council
- Plymouth Council
- Adding Norwich City Council

### Fix

- #744
- #671
- #566
- #749

## 0.99.1 (2024-10-16)

### Fix

- #792 adding web_driver option to Wokingham Council

## 0.99.0 (2024-10-16)

### Feat

- Adding Lincoln Council
- Adding Tunbridge Wells Council
- Adding Perth and Kinross Council

### Fix

- Update wiki
- #748
- #598
- #572

## 0.98.5 (2024-10-15)

### Fix

- Swale Borough Council
- HaltonBoroughCouncil
- Barnet Council
- WestBerkshireCouncil

## 0.98.4 (2024-10-14)

### Fix

- West Suffolk Council
- Vale of White Horse Council
- Uttlesford District Council
- Neath Port Talbot Council
- Merton Council
- Manchester City Council
- Glasgow City Council
- BradfordMDC

## 0.98.3 (2024-10-13)

### Fix

- EastRiding

## 0.98.2 (2024-10-13)

### Fix

- MoleValley

## 0.98.1 (2024-10-13)

### Fix

- Barnet and Bexley

## 0.98.0 (2024-10-13)

### Feat

- Adding Wirral Council
- Adding Lichfield District Council
- Adding Westmorland And Furness
- Adding Walsall Council
- Adding Armagh, Banbridge and Craigavon Council

### Fix

- #602
- #830
- #870
- #873
- #877

## 0.97.1 (2024-10-10)

### Fix

- NottinghamCityCouncil - #875

## 0.97.0 (2024-10-10)

### Feat

- Adding Falkirk Council

### Fix

- #761

## 0.96.0 (2024-10-10)

### Feat

- Adding London Borough Harrow
- Adding North Ayrshire Council
- Adding Highland Council
- Add Elmbridge Borough Council
- Adding Southwark Council
- South Derbyshire District Council

### Fix

- #871
- #869
- #780
- #845 fix: #754
- #835
- #842

## 0.95.0 (2024-10-09)

### Feat

- Adding London Borough of Ealing

## 0.94.0 (2024-10-09)

### Feat

- Adding London Borough of Lambeth
- Adding Dacorum Borough Council

### Fix

- Dacorum Borough Council
- East Devon DC

## 0.93.0 (2024-10-08)

### Feat

- Update CheshireEastCouncil.py

## 0.92.0 (2024-10-08)

### Feat

- Update CheshireEastCouncil.py
- Update README.md
- Adding Wokingham Borough Council
- Adding Winchester City Council
- Adding Basildon Council
- Adding Colchester City Council

### Fix

- RochfordCouncil
- Neath Port Talbot Council
- Buckinghamshire Council
- #639 fix: #812

## 0.91.2 (2024-10-05)

### Fix

- Windsor and Maidenhead Council

## 0.91.1 (2024-10-04)

## 0.91.0 (2024-10-03)

## 0.90.0 (2024-10-03)

### Feat

- Adding East Renfrewshire Council

### Fix

- Update DorsetCouncil.py - #829
- Update GatesheadCouncil.py - #822

## 0.89.1 (2024-10-02)

### Fix

- High Peak have changed their cookie dialog. Seems to be safe to ignore it now.

## 0.89.0 (2024-09-27)

### Feat

- Update CheshireEastCouncil.py
- Update README.md

### Fix

- release to be non-pre-release

## 0.88.0 (2024-09-16)

### Feat

- Add Ealing Council

### Fix

- Update README.md

## 0.87.0 (2024-09-10)

### Feat

- Add IslingtonCouncil

### Fix

- #565 Gloucester City Council driver

## 0.86.1 (2024-09-09)

### Fix

- #773 Wakefield

## 0.86.0 (2024-09-06)

### Feat

- added Rotherham Council

## 0.85.7 (2024-09-05)

### Fix

- more unit tests
- Chorley

## 0.85.6 (2024-09-03)

### Fix

- #795 and add reconfigure to custom comp.

## 0.85.5 (2024-09-03)

## 0.85.4 (2024-09-03)

### Fix

- #795 and add reconfigure to custom comp.
- #795 Unit Test Coverage

## 0.85.3 (2024-09-02)

### Fix

- #795 unit test coverage

## 0.85.2 (2024-09-02)

### Fix

- #791 Glasgow URL change

## 0.85.1 (2024-09-02)

### Fix

- #779 Add correct async wait to Home Assistant

## 0.85.0 (2024-08-27)

### Feat

- support for Enfield Council

## 0.84.2 (2024-08-27)

### Fix

- Re-work North Tyneside Council module for 2024
- some addresses do not have a garden collection

## 0.84.1 (2024-08-08)

### Fix

- #771 Bolton bullet points on dates is now fixed

## 0.84.0 (2024-07-31)

## 0.83.0 (2024-07-07)

### Feat

- add has_numbers() function

### Fix

- update Gedling Borough Council parser to use alternative name key
- change Gedling to use new JSON data
- update instructions for Gedling
- update input.json to use UPRN parameter
- change DorsetCouncil.py to use API links provided in #756
- explicit import of logging.config to stop error in Python 3.11

## 0.82.0 (2024-06-13)

### Feat

- adding dev container updates
- refactoring main files
- adding ability to set local mode in HA custom comp. if users don't have a Selenium Server

### Fix

- MidSussex

## 0.81.0 (2024-06-05)

### Feat

- Adding Wychavon District Council

### Fix

- IntTestWarnings

## 0.80.0 (2024-06-02)

### Feat

- Adding Uttlesford District Council
- Adding Stafford Boro Council
- Adding Swansea Council
- Adding New Forest
- Adding Three Rivers

### Fix

- ThreeRivers
- #425 Entities are not updated
- sessions to avoid deprecation
- Update docker-image.yml

## 0.79.1 (2024-05-29)

### Fix

- Change CSS class in search for collection types

## 0.79.0 (2024-05-28)

### Feat

- Adding Dartford
- Adding South Kesteven District Council
- Adding ChichesterCouncil
- adding HounslowCouncil
- Epping Fix
- Adding Epping Forest District Council
- Update input.json
- Adding Stroud District Council
- Add support for Tendring District Council
- #269 Adding Waltham Forest
- Adding council creation script

### Fix

- Update Mole Valley URL

## 0.78.0 (2024-05-26)

### Feat

- Add support for Fareham Borough Council

## 0.77.0 (2024-05-26)

### Feat

- Add support for Bracknell Forest Council

## 0.76.1 (2024-05-24)

### Fix

- Handle Barnet council cookies message

## 0.76.0 (2024-05-24)

### Feat

- add bin colour support WestSuffolkCouncil style: black format WestSuffolkCouncil

## 0.75.0 (2024-05-19)

### Feat

- #725 Add names to selenium test videos using "se:name" option in create webdriver function

## 0.74.1 (2024-05-18)

### Fix

- #693 Cheshire West & Chester Council Sensor Bug

## 0.74.0 (2024-05-17)

### Feat

- #722 Support Python 3.12

## 0.73.0 (2024-05-17)

### Feat

- #708 Adding HA to the dev container for debugging

## 0.72.0 (2024-05-17)

### Feat

- #708 Adding HA to the dev container for debugging

## 0.71.0 (2024-05-17)

### Feat

- Update for West Suffolk Council's new website

## 0.70.0 (2024-05-17)

### Feat

- #708 Dev Container
- #708 simplifying Selenium integration tests
- #708 Test GH action selenium
- #708 Dev Container testing
- #708 dev container changes
- #706 Adding Dev Container
- #706 Adding initial Dev Container

## 0.69.7 (2024-05-17)

### Fix

- #713 BarnsleyMBCouncil.py

## 0.69.6 (2024-05-16)

### Fix

- #709 Update DoverDistrictCouncil.py

## 0.69.5 (2024-05-14)

### Fix

- #696 Small issue and Black formatting
- #696 test coverage back to 100%

## 0.69.4 (2024-05-09)

### Fix

- pass in required parameter into `create_webdriver`
- test runners for `MiltonKeynesCityCouncil` and `NorthEastLincs`
## 0.69.3 (2024-05-09)

### Fix

- fix AttributeError when no garden waste collection is available for properties using Huntingdon District Council
- add support for parsing "Today" / "Tomorrow" as date text for `BarnsleyMBCouncil`
- add support for parsing "Tomorrow" as date text for `LiverpoolCityCouncil`

## 0.69.1 (2024-05-01)

### Fix

- Handling the "Website cookies enhance your user experience." button

## 0.69.0 (2024-04-28)

### Feat

- Adding Renfrewshire Council

## 0.68.2 (2024-04-28)

### Fix

- Remove 'import Dumper'

## 0.68.1 (2024-04-27)

### Fix

- input.json Bradford missing comma

## 0.68.0 (2024-04-27)

### Feat

- Add support for West Berkshire Council
- add support for Knowsley Metropolitan Borough Council
- add support for Cheshire West and Chester Council

## 0.66.2 (2024-04-18)

### Fix

- Update HaringeyCouncil.py issue #670

## 0.66.1 (2024-04-15)

### Fix

- parse datetimes correctly and round to midnight

## 0.66.0 (2024-04-15)

## 0.65.2 (2024-04-15)

### Fix

- change address selection to fix errors selecting the user's PAON

## 0.65.1 (2024-04-15)

### Fix

- add check for parsed string length to stop datetime parsing error

## 0.65.0 (2024-04-13)

### Feat

- add Arun council
- add support for Sunderland City Council

## 0.64.3 (2024-03-25)

### Fix

- sort data and correct dictionary name (#609)

## 0.64.2 (2024-03-24)

## 0.64.1 (2024-03-24)

### Fix

- fix Kirklees address search (switch to house & postcode)
- fixes json

## 0.64.0 (2024-03-23)

### Feat

- add Kirklees council

### Fix

- fixes json

## 0.63.0 (2024-03-23)

### Feat

- Add Solihull Council (#513)
- Add Adur and Worthing Councils (#544)
- Add Dover District Council (#614)
- Add Rochford Council (#620)
- Add Tandridge District Council (#621)
- Add West Northamptonshire Council (#567)
- Add Hull City Council (#622)
- Add Wyre Council (#625)
- Add Telford and Wrekin Co-operative Council (#632)
- Add Mansfield District Council (#560)
- Add Bedford Borough Council (#552)

### Fix

- spacing on input.json
- realign input.json
- capitalize bin type text
- formatting on input.json
- incorrect collections
- update testing URL for Merton
- attempt to resolve invisible banner hiding postcode box
- resolve JSON schema exception for date formatting
- accept cookies banner

## 0.62.0 (2024-03-03)

### Fix

- Added missing .feature file entry to the test config for NewhamCouncil

## 0.61.1 (2024-02-16)

### Fix

- code optimisations
- Fix date parsing in WestLindseyDistrictCouncil.py

## 0.61.0 (2024-02-11)

### Feat

- Add Mole Valley District Council

## 0.60.1 (2024-02-03)

### Fix

- Update input.json Closes #599

## 0.60.0 (2024-01-28)

### Feat

- Add Scraper for St Albans City and District Council

## 0.59.1 (2024-01-25)

### Fix

- add wiki note for castlepoint
- update test data for castlepoint
- remove single line causing issues

## 0.59.0 (2024-01-20)

### Feat

- Add NorthYorkshire to test feature file
- Add North Yorkshire to test input
- Add support for North Yorkshire council

### Fix

- remove unused code

## 0.58.8 (2024-01-19)

### Fix

- Barnet no overrides

## 0.58.7 (2024-01-18)

### Fix

- accidentally returned strings where date objects were needed; refactor to handle this
- checking for future/past dates

## 0.58.6 (2024-01-18)

### Fix

- correct date handling for North West Leicestershire

## 0.58.5 (2024-01-15)

### Fix

- Don't call driver.quit where already handled by finally block

## 0.58.4 (2024-01-15)

### Fix

- remove extra driver.quit to prevent errors

## 0.58.3 (2024-01-15)

### Feat

- Added support for Newham Council's bin collections

### Fix

- Add a default value for user_agent to fix all councils using selenium and not specifying an agent

## 0.58.2 (2024-01-11)

### Fix

- use static values for bin types

## 0.58.1 (2024-01-10)

### Fix

- Eastleigh Borough Council doesn't cope with "You haven't yet signed up for ..."
- Eastleigh Borough Council doesn't cope when the Garden Waste service hasn't been signed up for, which returns the value "You haven't yet signed up for our garden waste collections. Find out more about our garden waste collection service" and results in ValueError: time data

## 0.58.0 (2024-01-10)

### Feat

- Add Test Valley Borough Council

## 0.57.0 (2024-01-09)

### Feat

- Add support for Chorley Council

## 0.56.13 (2024-01-09)

### Fix

- update logic to account for council website change

## 0.56.12 (2024-01-09)

### Fix

- duplicate driver.quit() calls cause errors

## 0.56.11 (2024-01-08)

### Fix

- Headless now working in custom component (Update sensor.py)

## 0.56.10 (2024-01-08)

### Fix

- headless mode in custom component

## 0.56.9 (2024-01-08)

### Fix

- headless mode

## 0.56.8 (2024-01-08)

### Fix

- headless in custom comp

## 0.56.7 (2024-01-08)

### Fix

- headless options

## 0.56.6 (2024-01-07)

### Fix

- modified Kingston-upon-Thames driver for greater reliability
## 0.56.5 (2024-01-07)

### Fix

- Update KingstonUponThamesCouncil.py

## 0.56.4 (2024-01-07)

### Fix

- Update KingstonUponThamesCouncil.py

## 0.56.3 (2024-01-07)

### Fix

- headless options
- #542 - Selenium Grid Sessions must be terminated cleanly

## 0.56.2 (2024-01-07)

### Fix

- Update strings.json
- Update en.json
- Update config_flow.py

## 0.56.1 (2024-01-07)

### Fix

- Update common.py

## 0.56.0 (2024-01-07)

### Feat

- Update strings.json
- Update en.json
- Update config_flow.py
- adding headless control

## 0.55.3 (2024-01-05)

### Fix

- Update lint.yml

## 0.55.2 (2024-01-05)

### Fix

- Chelmsford

## 0.55.1 (2024-01-05)

### Fix

- Update ChelmsfordCityCouncil.py

## 0.155.0 (2025-10-11)

### Feat

- Create tag-on-merge.yml
- Update bump.yml
- fix bump.yml
- Update TorbayCouncil.py
- fix release pipeline bump.yml
- fix Torbay
- fix releases process

### Fix

- Update AberdeenCityCouncil.py
- Update TorbayCouncil.py
- Update URL for NewForestCouncil
- New URL and page for wheelie bins
- improve Mid Suffolk District Council holiday handling with dynamic bank holiday detection
- Oxford now rejects the "Requests" default user agent
- #1557 - Adding East Dunbartonshire
- #1569 - Somerset Council
- #1559 - Newport City Council
- #1574 - Test Valley Borough Council
- #1566 South Gloucestershire Council

## 0.154.0 (2025-09-21)

### Feat

- handle changes to Northumberland council website
- modify input for NorthumberlandCouncil to accept uprn instead of house number, and use new page structure

### Fix

- the cookie banner is not optional
- #1570 - Slough Borough Council
- #1520 - Erewash Borough Council
- #1554 - Folkestone and Hythe District Council
- #1604 - West Berkshire Council
- #1606 - Brighton and Hove City Council
- #1565 - BCP Council
- #1571 - Castle Point District Council
- #1584 - NorthHertfordshireDistrictCouncil
- #1599 - Basingstoke Council
- #1587 - Hartlepool Borough Council
- #1588 - Glasgow City Council
- #1591 - Rushmoor Council

## 0.153.0 (2025-09-02)

### Feat

- Change Buckinghamshire council to get data from endpoint

### Fix

- #1573 Update Bolton council URL
- East Herts Council - #1575
- Runnymede Borough Council - #1513
- Wiltshire Council - #1533
- Staffordshire Moorlands District Council - #1535
- Ipswich Borough Council - #1548
- North East Lincs
- Hinckley and Bosworth Borough Council
- Nuneaton Bedworth Borough Council - #1514
- Lichfield District Council - #1549

## 0.152.11 (2025-08-25)

### Fix

- date extraction in RochfordCouncil data parsing
- parsing error in BH selenium
- **hacs**: respect the headless option

### Refactor

- **hacs**: improve build_ukbcd_args with formatter functions

## 0.152.10 (2025-08-04)

### Fix

- Gateshead and East Lothian
- Enfield and Broxbourne
- East Herts
- FermanaghOmaghDistrictCouncil

## 0.152.9 (2025-08-03)

### Fix

- Cotswold and Coventry
- Fixing multiple broken councils

## 0.152.8 (2025-07-26)

### Fix

- Add headers to request for Swindon Borough Council
- Add headers to requests for Royal Borough of Greenwich. Fixes #1496 by ensuring that the requests are not rejected due to lack of headers.
- **MidlothianCouncil**: add request headers to resolve 403 Forbidden

## 0.152.7 (2025-07-01)

### Fix

- Maidstone selenium fix

## 0.152.6 (2025-06-18)

### Fix

- removed "In Progress" from date
- removed a debug print statement
- **RugbyBoroughCouncil**: Amended parsed date from full to abbreviated month format; May worked but Jun and Jul did not
- **RugbyBoroughCouncil**: Amended parsed date
- Reworked Cumberland Council to cater for postcode addition
- **OxfordCityCouncil**: Fixed Oxford City Council parsing due to changes in output from the website

## 0.152.5 (2025-06-07)

### Fix

- South Ribble and version pinning issues for input.json

## 0.152.4 (2025-06-07)

### Fix

- **SouthRibble**: Corrected date formatting issue
- **SouthRibble**: Resolved South Ribble without selenium

## 0.152.3 (2025-06-04)

### Fix

- NorthHertfordshire selenium script
- Adur council
- Eastleigh date fix
- removed duplicates in BradfordMDC

## 0.152.2 (2025-06-04)

### Fix

- Update Makefile
- Update CheshireEastCouncil.py
- GitHub action to handle branch name with parentheses

## 0.152.1 (2025-05-15)

### Fix

- Update to fix North Somerset
- Glasgow SSL bypass
- more robust Northumberland
- updated Eastleigh input.json
- Eastleigh cloudflare fix
- converted collection datetimes into dates for BH parsing
- added check_uprn to simplified councils
- simplified Swindon
- simplified East Devon
- simplified Dover
- Simplified Dartford
- simplified Cheshire East
- simplified Charnwood input.json
- improved Charnwood
- Adur Worthing fix
- Chorley simplification
- Bexley simplification
- added URL to Torbay script
- Guildford fixes
- reworked Maidstone
- maidstone input.json
- Croydon selenium version
- Stoke date-time fix

## 0.152.0 (2025-05-02)

### Feat

- Added Fermanagh Omagh
- Added Tewkesbury
- added Slough council
- Added Angus Council
- added Angus to input.json

### Fix

- Chichester now only requires postcode and house number
- Broadland now only requires postcode and house number
- Barking now only requires postcode and house number
- Brighton now only requires postcode and house number
- ensured all bins for this council
- added skip_get_url to Hyndburn

## 0.151.0 (2025-04-27)

### Feat

- version bump

### Fix

- more robust Brent date handling
- input.json requires web_driver
- Rugby fix
- simplified Blackburn

## 0.150.0 (2025-04-27)

### Feat

- added Melton

### Fix

- added Melton
- processed all bins for Moray

## 0.149.0 (2025-04-27)

### Feat

- added Pembrokeshire

## 0.148.6 (2025-04-27)

### Fix

- updated input.json
- cloudflare fix
- switch to selenium method

## 0.148.5 (2025-04-27)

### Fix

- output check
- parsed bin info
- selenium navigation
- input.json changes

## 0.148.4 (2025-04-27)

### Fix

- used canonical 'nice name'

## 0.148.3 (2025-04-25)

### Fix

- working Hyndburn
- Hyndburn input.json

## 0.148.2 (2025-04-24)

### Fix

- Update docker-compose.yml

## 0.148.1 (2025-04-22)

### Fix

- added bank holiday offsets
## 0.148.0 (2025-04-19)

### Feat

- adding Wrexham and #1046 Horsham councils

### Fix

- Argyll and Bute council #1053

## 0.147.2 (2025-04-18)

### Fix

- wait for element to be clickable

## 0.147.1 (2025-04-18)

### Fix

- #1351 - moved geopandas to poetry dev

## 0.147.0 (2025-04-18)

### Feat

- add council test results map

## 0.146.2 (2025-04-18)

### Fix

- adding map checking and matching

## 0.146.1 (2025-04-18)

### Fix

- more robust bank holiday handling

## 0.146.0 (2025-04-18)

### Feat

- #1342 Adding Trafford, Clackmannanshire, Havant, North Warwickshire, Newry Mourne and Down, East Dunbartonshire, Pendle, Torfaen, East Hampshire, Ribble Valley, Brentwood, Isle of Wight, Westmorland and Furness, Derry and Strabane, and Norwich
- Google Cal support for PDF councils via ICS file

### Fix

- Black reformatting

## 0.145.0 (2025-04-18)

### Feat

- Adding PDF councils

## 0.144.4 (2025-04-18)

### Fix

- Bristol #1275

## 0.144.3 (2025-04-17)

### Fix

- better address for input.json
- bank holiday overrides
- more robust address searching
- simple parsing done
- Selenium navigation

## 0.144.2 (2025-04-17)

### Fix

- Knowsley
- KnowsleyMBCouncil.py
- #1220 adding Mid Ulster District Council

## 0.144.1 (2025-04-17)

### Fix

- fix Sandwell garden waste collection date

## 0.144.0 (2025-04-17)

### Feat

- added Great Yarmouth

## 0.143.6 (2025-04-17)

### Fix

- Renfrewshire Council

## 0.143.5 (2025-04-17)

### Fix

- Google Cal

## 0.143.4 (2025-04-17)

### Fix

- Google Cal

## 0.143.3 (2025-04-15)

### Fix

- #1301 Fix Leeds Council

## 0.143.2 (2025-04-15)

### Fix

- #1301 Fix Leeds Council

## 0.143.1 (2025-04-15)

### Fix

- Set the bin_type when different day

## 0.143.0 (2025-04-13)

### Fix

- corrected url in input.json
- fixed input.json
- parsed Barking Dagenham collection information
- selenium navigation Barking

## 0.142.0 (2025-04-13)

### Feat

- Added Stirling Council

### Fix

- typo in input.json

## 0.141.4 (2025-04-13)

### Fix

- #1304 - sensors go to unknown if the data is blank from councils that are less reliable

## 0.141.3 (2025-04-13)

### Fix

- Newham council

## 0.141.2 (2025-04-13)

### Fix

- Newham council

## 0.141.1 (2025-04-12)

### Fix

- missing finally block on selenium tests

## 0.141.0 (2025-04-12)

### Feat

- #1185 Adding Peterborough City Council

## 0.140.0 (2025-04-11)

### Feat

- Added Broadland District Council

### Fix

- cleanup of council file
- added Broadland to input.json

## 0.139.0 (2025-04-07)

### Feat

- adding #1037
- adding #1032 North Devon County Council

### Fix

- #1296 Forest of Dean
- #939 adding South Holland District Council - Lincolnshire UK

## 0.138.1 (2025-04-05)

### Fix

- Waltham Forest council
- revert previous changes

## 0.138.0 (2025-04-05)

### Feat

- Adding Hastings Borough Council
- Adding Fylde Council

### Fix

- #1249
- #1039 fix: #1181 fix: #1266 fix: #1274
- Gloucester City Council - #1282
- Mid Devon Council - #1277 fix: #1287
- West Oxfordshire Council - #1290

## 0.137.0 (2025-04-05)

### Feat

- #816 adding Trafford council

## 0.136.0 (2025-03-24)

### Feat

- Adding Southampton City Council
- Adding Cambridge City Council
- Adding Spelthorne Borough Council

### Fix

- #1057
- #1264
- #1270
- Bexley Council - #1256
- HinckleyandBosworthBoroughCouncil - #1207
- Hackney Council - #1230
- Castlepoint District Council - #1252
- Canterbury City Council - #1254

## 0.135.4 (2025-03-24)

### Fix

- parse scheduleCodeWorkflowIDs instead of scheduleCodeWorkflowID for Hackney Council

## 0.135.3 (2025-02-23)

## 0.135.2 (2025-02-19)

### Fix

- North Yorkshire - multiple bins on a day

## 0.135.1 (2025-02-18)

### Fix

- devcontainer

## 0.135.0 (2025-02-17)

### Feat

- Adding Runnymede Borough Council
- Adding Cherwell District Council
- Adding Epsom and Ewell Borough Council
- Adding Redcar and Cleveland Council
- Adding Amber Valley Borough Council
- Adding Bolsover Council

### Fix

- #833 adding Middlesbrough and check script for Selenium
- Cotswold District Council - #1238
- Leeds City Council - #1222
- Derbyshire Dales District Council
- Conwy County Borough
- Sunderland City Council - #1219
- Tendring District Council - #1221
- #1214
- #923
- #895
- #841
- #903
- #990
- Torridge District Council - #1204
- Neath Port Talbot - #1213

## 0.134.3 (2025-02-15)

### Fix

- Update input.json
- #1235 Councils missing Selenium in input.json

## 0.134.2 (2025-02-15)

### Fix

- #1232 East Herts missing Selenium url in input.json

## 0.134.1 (2025-02-11)

### Fix

- Cheltenham Borough Council - #1061

## 0.134.0 (2025-02-07)

### Feat

- Ipswich Borough Council - trying different address
- Ipswich Borough Council - correcting param name in input.json
- Ipswich Borough Council - added input.json values and refactored code
- Ipswich Borough Council - initial implementation

## 0.133.0 (2025-02-02)

### Feat

- adding manual refresh

## 0.132.0 (2025-02-02)

### Feat

- adding manual refresh

## 0.131.0 (2025-02-02)

### Feat

- adding manual refresh
- adding unit tests for the new manual refresh
- adding manual refresh control

## 0.130.1 (2025-01-30)

### Fix

- slow councils

## 0.130.0 (2025-01-29)

### Feat

- Add Herefordshire Council (closes: #1011)

### Fix

- Fix spacing in wiki name

## 0.129.0 (2025-01-29)

### Feat

- Adding East Staffordshire Borough Council
- Adding Boston Borough Council

### Fix

- input.json
- Adding East Lothian Council - #1171
- #1052 fix: #1083
- Leicester City Council - #1178
- Cardiff Council - #1175
- Newcastle City Council - #1179
- #1180
- Midlothian Council - #1192
- Adding Next Page support
- Swale Borough Council - #1139

## 0.128.6 (2025-01-29)

### Fix

- moving away from broken Allure reporting

## 0.128.5 (2025-01-29)

### Fix

- Update behave_pull_request.yml
- Update CheshireEastCouncil.py

## 0.128.4 (2025-01-28)

### Fix

- Update CheshireEastCouncil.py
- Update behave_schedule.yml
- Update behave_pull_request.yml

## 0.128.3 (2025-01-28)

### Fix

- Update CheshireEastCouncil.py
- Update behave_schedule.yml
- Update behave_pull_request.yml

## 0.128.2 (2025-01-28)

### Fix

- Add communal recycling and communal rubbish
- Add garden waste to Merton Council

## 0.128.1 (2025-01-28)

### Fix

- Update AberdeenshireCouncil.py
- Update behave_schedule.yml
- Update behave_pull_request.yml

## 0.128.0 (2025-01-28)

### Feat

- implement Medway Council (#1021)

### Fix

- Forgot to include skip_get_url

## 0.127.4 (2025-01-25)

### Fix

- NewForestCouncil

## 0.127.3 (2025-01-16)

### Fix

- Vale of White Horse - #1156
- South Oxfordshire Council - #1158
- Surrey Heath Borough Council - #1164
- Carmarthenshire County Council - #1167
- Glasgow City Council - #1166
- Merton Council
- NewarkAndSherwoodDC
- Rushcliffe Borough Council
- Powys Council
- Staffordshire Moorlands District Council
- Stroud District Council
- Vale of Glamorgan Council
- West Oxfordshire District Council

## 0.127.2 (2025-01-13)

### Fix

- Update bin type to be the full string

## 0.127.1 (2025-01-10)

### Fix

- Use visibility of list rather than existence
- Update Rushcliffe Borough Council input elements and flow

## 0.127.0 (2025-01-07)

### Feat

- Adding Oadby And Wigston Borough Council
- Add Gwynedd Council
- Adding Denbighshire Council
- Adding Dundee City Council
- Adding Brent Council
- Adding West Dunbartonshire Council
- Adding Cumberland Council

### Fix

- #929
- Cornwall Council - #1137
- #1125
- #1106
- #1108
- #1109
- #1134
- Northumberland Council - #1082
- #1110
- Waltham Forest - #1126
- London Borough Sutton - #1131
- Kirklees Council - #1129 - Breaking Change. UPRN required

## 0.126.2 (2025-01-07)

### Fix

- **tests**: updates test case url for Coventry City Council
- **tests**: removes duplicate key for Coventry City Council
- updates Coventry City Council button text

## 0.126.1 (2025-01-06)

### Fix

- behave_testing

## 0.126.0 (2025-01-04)

### Feat

- #1063 - rewrite Kirklees Council parser for new website
- #1067 - Add garden bin collections where available for Norwich City Council
- Adding Wandsworth Council

### Fix

- Update behave_schedule.yml
- Update behave_pull_request.yml
- Update README.md to have links to Full and Partial Integration Test Reports
- Swale Borough Council - #1080 (cherry picked from commit 6f580b39fb68b8079990221e050ae8dd6d2b7285)
- Update ArdsAndNorthDownCouncil.py
- Update WestLindseyDistrictCouncil.py
- #1101 - Fix table parsing for Walsall Council
- Remove invalid escape sequence warnings from West Lindsey District Council
- #1073 - change method of generating bin types to avoid manual mapping for Rugby Borough Council
- add missing backticks to separate colour config and standard usage instructions
- #1078 (cherry picked from commit 89d93666bb659010d1c130b98c1d81c6ff80cf7c)
- change date format to project default for Merton Council
- correct date logic for Swale Borough Council
- Merton Council
- London Borough Sutton - #1076 (cherry picked from commit 1eab20c9a57c9c4438ea343f374202bb2e9b98ca)
- correct date/year logic for West Lindsey District Council
- replace West Lindsey's input with working address
- #1089 - Correct shifted dates in Bromley Borough Council
- remove WDM import
- #1087 - Food waste date incorrect for West Berkshire Council

## 0.125.2 (2025-01-04)

### Fix

- Update ArdsAndNorthDownCouncil.py
- Update behave_schedule.yml
- Update behave_pull_request.yml
- Update README.md to have links to Full and Partial Integration Test Reports

## 0.125.1 (2025-01-04)

### Fix

- correctly handle year increment for January dates

## 0.125.0 (2025-01-04)

### Feat

- Adding Redditch Borough Council
- Adding Blaenau Gwent County Borough Council
- Adding Wandsworth Council

### Fix

- #1068
- #1098
- Wiltshire Council - #1094
- Salford City Council - #1097
- #1078
- Merton Council
- Swale Borough Council - #1080
- London Borough Sutton - #1076
- Update behave_schedule.yml
- Update bump.yml

## 0.124.4 (2025-01-04)

### Fix

- Update behave_schedule.yml

## 0.124.3 (2025-01-04)

### Fix

- allure reporting

## 0.124.2 (2025-01-03)

### Fix

- Update behave.yml

## 0.124.1 (2025-01-03)

### Fix

- avoid crashing on unexpected string value

## 0.124.0 (2025-01-02)

### Feat

- Hart District Council

## 0.123.2 (2024-12-19)

### Fix

- Update behave.yml

## 0.123.1 (2024-12-18)

### Fix

- Update AberdeenCityCouncil.py
- Update behave.yml

## 0.123.0 (2024-12-17)

## 0.122.0 (2024-12-04)

### Feat

- Adding Monmouthshire County Council
- Adding Hinckley and Bosworth Borough Council

### Fix

- Glasgow City Council
- Merton Council
- Blaby District Council
- Warwick District Council
- Blackburn Council
- Carmarthenshire County Council
- High Peak Council
- CarmarthenshireCountyCouncil

## 0.121.1 (2024-12-03)

### Fix

- London Borough of Lewisham to have more reliable parsing of dates

## 0.121.0 (2024-11-24)

### Feat

- Royal Borough of Greenwich
- Adding London Borough of Lewisham
- Adding Hackney Council
- Adding Sandwell Borough Council
- Adding Moray Council
- Adding Kings Lynn and West Norfolk Borough Council
- Adding Wyre Forest District Council
- Adding Folkestone and Hythe District Council
- Adding Cheltenham Borough Council
- Adding Thurrock Council

### Fix

- West Northamptonshire Council
- East Ayrshire Council
- Cotswold District Council

## 0.120.0 (2024-11-20)

### Feat

- Adding Hartlepool Borough Council
- Adding Newcastle Under Lyme Council
- Adding London Borough of Havering
- Add Garden collection to EnvironmentFirst
- Adding Cumberland Council (Allerdale District)
- Adding North Hertfordshire District Council

### Fix

- #844
- #778
- #769
- #1025
- Mid Suffolk and Babergh Garden Collection Day - #1026 This will require the use of a DAY to be added to the UPRN field
- #1029
- #1028

## 0.119.0 (2024-11-20)

### Feat

- Adding Braintree District Council
- Adding Burnley Borough Council
- Adding Exeter City Council
- Adding Edinburgh City Council
- Adding Aberdeen City Council

### Fix

- #699
- #1015
- #1017
- #894
- #1019

## 0.118.0 (2024-11-15)

### Feat

- Adding Wolverhampton City Council
- Adding Stevenage Borough Council
- Adding Thanet District Council
- Adding Copeland Borough Council
- Adding South Hams District Council
- Adding Rother District Council

### Fix

- #966
- #989
- #1004
- #1006
- #1008
- Rother District Council
- #1009
- CrawleyBoroughCouncil - #1005
- Adding Garden collection to Babergh and MidSuffolk Council - #995

## 0.117.0 (2024-11-13)

### Feat

- Adding South Staffordshire District Council fix: #885

## 0.116.0 (2024-11-12)

### Feat

- Adding Ashfield District Council
- Adding Gravesham Borough Council
- Adding Argyll and Bute Council

### Fix

- #579
- #991
- #692
- CheshireWestAndChesterCouncil - #993
- Milton Keynes - #702
- Adding Babergh and Mid Suffolk District Councils - #868 fix: #919
- Adding Derby City Council - #987

## 0.115.0 (2024-11-11)

### Feat

- Adding Warrington Borough Council
- Adding Antrim And Newtonabbey Council
- Adding Hertsmere Borough Council
- Adding West Lancashire Borough Council
- Broxbourne Council

### Fix

- #695
- #969
- #776
- #980
- #982
- Bradford MDC - #984

## 0.114.6 (2024-11-09)

### Fix

- NBBC Date Fix

## 0.114.5 (2024-11-08)

### Fix

- migration logging and debugging

## 0.114.4 (2024-11-08)

### Fix

- migration not working

## 0.114.3 (2024-11-08)

### Fix

- fix manifest in custom component

## 0.114.2 (2024-11-08)

### Fix

- #975 adding routine to handle migration error

## 0.114.1 (2024-11-08)

### Fix

- Update manifest.json

## 0.114.0 (2024-11-07)

### Feat

- Nuneaton and Bedworth Borough Council

## 0.113.0 (2024-11-07)

### Feat

- adding calendar for Bins in Custom Component

## 0.112.1 (2024-11-07)

### Fix

-
- #767 BREAKING CHANGE
- READD your sensors / config

## 0.112.0 (2024-11-06)

### Feat
- pytest fixes
- pytest fixes
- pytest fixes
- pytest fixes
- pytest fixes
- Adding Powys Council
- Adding Worcester City Council
- Adding Ards and North Down Council
- Adding East Herts Council
- Adding Ashford Borough Council
- Adding Stockton On Tees Council
- Adding Fife Council
- Adding Flintshire County Council
- Adding Teignbridge Council
- Adding Harborough District Council
- Adding Watford Borough Council
- Adding Coventry City Council

### Fix
- CC testing and add Chesterfield
- CC testing and add Chesterfield
- CC testing and add Chesterfield
- pytest-homeassistant-custom-component
- Pydandic version
- Pydandic version
- WestOxfordshireDistrictCouncil
- South Norfolk Council
- ForestOfDeanDistrictCouncil
- Croydon Council
- South Kesteven District Council
- #647
- #630
- #623
- #586
- #578
- #389
- #930
- #933
- #750
- Refactor Midlothian Council scraper to use house number and postcode
- West Berkshire Council
- Southwark Council
- #580
- #888
- #902
- #607

## 0.111.0 (2024-11-06)

### Fix
- Add London Borough of Sutton
- #944
- Add Mid Devon Council
- #945
- Adding Oxford City Council
- #962
- Tunbridge Wells / Lincoln
- #963
- Glasgow City Council

## 0.110.0 (2024-11-04)

### Fix
- Adding Blaby District Council
- #904
- Adding Sefton Council
- #770
- Adding Bromsgrove District Council
- #893
- East Lindsey District Council
- #957
- Adding Carmarthenshire County Council
- #892 fix: #710
- Adding East Ayrshire Council
- #955

## 0.109.2 (2024-11-03)

### Fix
- CC testing and add Chesterfield

## 0.109.1 (2024-11-03)

### Fix
- CC testing and add Chesterfield
- CC testing and add Chesterfield

## 0.109.0 (2024-11-02)

### Feat
- Adding Cotswold District Council
- Adding Breckland Council

### Fix
- St Helens Borough Council
- #753
- NewarkAndSherwoodDC
- #941
- #658
- #656

## 0.108.2 (2024-11-01)

### Fix
- pytest-homeassistant-custom-component

## 0.108.1 (2024-11-01)

### Fix
- Pydandic version
- Pydandic version

## 0.108.0 (2024-11-01)

### Feat
- pytest fixes
- pytest fixes
- pytest fixes
- pytest fixes
- pytest fixes
- pytest fixes
- Python 3.12 only and CustomComp. Unit testing

## 0.107.0 (2024-10-31)

### Feat
- Adding Powys Council
- Adding Worcester City Council
- Adding Ards and North Down Council
- Adding East Herts Council
- Adding Ashford Borough Council

### Fix
- WestOxfordshireDistrictCouncil
- South Norfolk Council
- ForestOfDeanDistrictCouncil
- Croydon Council
- South Kesteven District Council
- #647
- #630
- #623
- #586
- #578
- #389

## 0.106.0 (2024-10-28)

### Feat
- Adding Stockton On Tees Council
- Adding Fife Council
- Adding Flintshire County Council

### Fix
- #930
- #933
- #750
- West Berkshire Council
- Southwark Council

## 0.105.1 (2024-10-24)

### Fix
- Refactor Midlothian Council scraper to use house number and postcode

## 0.105.0 (2024-10-21)

### Feat
- Adding Teignbridge Council
- Adding Harborough District Council
- Adding Watford Borough Council
- Adding Coventry City Council

### Fix
- #580
- #888
- #902
- #607

## 0.104.0 (2024-10-20)

### Feat
- Adding Luton Borough Council
- Adding West Oxfordshire District Council
- Adding Aberdeenshire Council
- Adding Canterbury City Council
- Adding Swindon Borough Council

### Fix
- #697
- #694
- #659
- #590
- #900

## 0.103.0 (2024-10-20)

### Feat
- Adding RAW JSON Sensor

### Fix
- Black formatting
- Black formatting

## 0.102.0 (2024-10-20)

### Feat
- Moving from Attributes to Sensors
- Moving from Attributes to Sensors

## 0.101.0 (2024-10-20)

### Feat
- Add Midlothgian Council

## 0.100.0 (2024-10-18)

### Feat
- Adding Dudley Council
- Adding South Ribble Council
- Plymouth Council
- Adding Norwich City Council

### Fix
- #744
- #671
- #566
- #749

## 0.99.1 (2024-10-16)

### Fix
- #792 adding web_driver option to Wokingham Council

## 0.99.0 (2024-10-16)

### Feat
- Adding Lincoln Council
- Adding Tunbridge Wells Council
- Adding Perth and Kinross Council

### Fix
- Update wiki
- #748
- #598
- #572

## 0.98.5 (2024-10-15)

### Fix
- Swale Borough Council
- HaltonBoroughCouncil
- Barnet Council
- WestBerkshireCouncil

## 0.98.4 (2024-10-14)

### Fix
- West Suffolk Council
- Vale of White Horse Council
- Uttlesford District Council
- Neath Port Talbot Council
- Merton Council
- Manchester City Council
- Glasgow City Council
- BradfordMDC

## 0.98.3 (2024-10-13)

### Fix
- EastRiding

## 0.98.2 (2024-10-13)

### Fix
- MoleValley

## 0.98.1 (2024-10-13)

### Fix
- Barnet and Bexley

## 0.98.0 (2024-10-13)

### Feat
- Adding Wirral Council
- Adding Lichfield District Council
- Adding West Morland And Furness
- Adding Walsall Council
- Adding Armagh, Banbridge and Craigavon Council

### Fix
- #602
- #830
- #870
- #873
- #877

## 0.97.1 (2024-10-10)

### Fix
- NottinghamCityCouncil
- #875

## 0.97.0 (2024-10-10)

### Feat
- Adding Falkirk Council
- Adding London Borough Harrow
- Adding North Ayrshire Council

### Fix
- #761
- #871
- #869

## 0.96.0 (2024-10-10)

### Feat
- Adding Highland Council
- Add Elmbridge Borough Council
- Adding Southwark Council
- South Derbyshire District Council

### Fix
- #780
- #845 fix: #754
- #835
- #842

## 0.95.0 (2024-10-09)

### Feat
- Adding London Borough of Ealing

## 0.94.0 (2024-10-09)

### Feat
- Adding London Borough of Lambeth
- Adding Dacorum Borough Council

### Fix
- Dacorum Borough Council
- East Devon DC

## 0.93.0 (2024-10-08)

### Feat
- Update CheshireEastCouncil.py

## 0.92.0 (2024-10-08)

### Feat
- Update CheshireEastCouncil.py
- Update README.md
- Adding Wokingham Borough Council
- Adding Winchester City Council
- Adding Basildon Council
- Adding Colchester City Council

### Fix
- RochfordCouncil
- Neath Port Talbot Council
- Buckinghamshire Council
- #639 fix: #812

## 0.91.2 (2024-10-05)

### Fix
- Windsor and Maidenhead Council

## 0.91.1 (2024-10-04)

### Fix
- Update DorsetCouncil.py
- #829
- Update GatesheadCouncil.py
- #822

## 0.91.0 (2024-10-03)

### Feat
- Adding East Renfrewshire Council

## 0.90.0 (2024-10-03)

## 0.89.1 (2024-10-02)

### Fix
- High Peak have changed their cookie dialog Seems to be safe to ignore it now.

## 0.89.0 (2024-09-27)

### Feat
- Update CheshireEastCouncil.py
- Update README.md

### Fix
- release to be non pre release

## 0.88.0 (2024-09-16)

### Feat
- Add Ealing Council

### Fix
- Update README.md

## 0.87.0 (2024-09-10)

### Feat
- Add IslingtonCouncil

## 0.86.2 (2024-09-09)

### Fix
- #565 Gloucester city council driver

## 0.86.1 (2024-09-09)

### Fix
- #773 Wakefield

## 0.86.0 (2024-09-06)

### Feat
- added Rotherham Council

## 0.85.7 (2024-09-05)

### Fix
- more unit tests
- more unit tests
- Chorley

## 0.85.6 (2024-09-03)

### Fix
- #795 and add reconfigure to custom comp.

## 0.85.5 (2024-09-03)

### Fix
- #795 and add reconfigure to custom comp.

## 0.85.4 (2024-09-03)

### Fix
- #795 Unit Test Coverage

## 0.85.3 (2024-09-02)

### Fix
- #795 unit test coverage

## 0.85.2 (2024-09-02)

### Fix
- 791 Glasgow URL change

## 0.85.1 (2024-09-02)

### Fix
- 779 Add correct async wait to Home Assistant

## 0.85.0 (2024-08-27)

### Feat
- support for enfield council

## 0.84.2 (2024-08-27)

### Fix
- Re-work North Tyneside Council module for 2024
- some addresses do not have a garden collection
- Re-work North Tyneside Council module for 2024

## 0.84.1 (2024-08-08)

### Fix
- #771 Bolton bullet points on dates is now fixed

## 0.84.0 (2024-07-31)

## 0.83.0 (2024-07-07)

### Feat
- add has_numbers() function

### Fix
- update Gedling Borough Council parser to use alternative name key
- change Gedling to use new JSON data
- update instructions for Gedling

## 0.82.1 (2024-06-28)

### Fix
- update input.json to use UPRN parameter
- change DorsetCouncil.py to use API links provided in #756
- explicit import of logging.config to stop error in Python 3.11

## 0.82.0 (2024-06-13)

### Feat
- adding dev container updates
- adding dev container updates
- refactoring main files
- adding ability to set local mode in HA custom comp. if users dont have a Selenium Server

### Fix
- MidSussex

## 0.81.0 (2024-06-05)

### Feat
- Adding Wychavon District Council

### Fix
- IntTestWarnings
- IntTestWarnings

## 0.80.0 (2024-06-02)

### Feat
- Adding Uttlesford District Council
- Adding Stafford Boro Council
- Adding Swansea Council
- Adding New Forest
- Adding Three Rivers
- Adding Three Rivers

### Fix
- ThreeRivers
- #425 Entities are not updated
- sessions to avoid deprecation
- Update docker-image.yml
- Update docker-image.yml

## 0.79.1 (2024-05-29)

### Fix
- Change CSS class in search for collection types

## 0.79.0 (2024-05-28)

### Feat
- Adding Dartford
- Adding South Kesteven District Council
- Adding ChichesterCouncil
- adding HounslowCouncil
- adding HounslowCouncil
- adding HounslowCouncil
- Epping Fix
- Adding Epping Forest District Council
- Update input.json
- Epping Forest District Council
- Adding Stroud District Council
- Add support for Tendring District Council
- #269 Adding Waltham Forest
- #269 Adding Waltham Forest
- Adding council creation script

### Fix
- Update Mole Valley URL

## 0.78.0 (2024-05-26)

### Feat
- Add support for Fareham Borough Council

## 0.77.0 (2024-05-26)

### Feat
- Add support for Bracknell Forest Council

## 0.76.1 (2024-05-24)

### Fix
- Handle Barnet council cookies message

## 0.76.0 (2024-05-24)

### Feat
- add bin colour support WestSuffolkCouncil style: black format WestSuffolkCouncil
- add bin colour support WestSuffolkCouncil style: black format WestSuffolkCouncil

## 0.75.0 (2024-05-19)

### Feat
- #725 Add names to selenium test videos using "se:name" option in create webdriver function

## 0.74.1 (2024-05-18)

### Fix
- #693 Cheshire West & Chester Council Sensor Bug
- #693 Cheshire West & Chester Council Sensor Bug

## 0.74.0 (2024-05-17)

### Feat
- #722 Support Python 3.12
- #722 Support Python 3.12
- #722 Support Python 3.12

## 0.73.0 (2024-05-17)

### Feat
- #708 Adding HA to the dev container for debugging

## 0.72.0 (2024-05-17)

### Feat
- #708 Adding HA to the dev container for debugging
- #708 Adding HA to the dev container for debugging
- #708 Adding HA to the dev container for debugging
- #708 Adding HA to the dev container for debugging
- #708 Adding HA to the dev container for debugging
- #708 Adding HA to the dev container for debugging
- #708 Adding HA to the dev container for debugging
- #708 Adding HA to the dev container for debugging
- #708 Adding HA to the dev container for debugging

## 0.71.0 (2024-05-17)

### Feat
- Update for West Suffolk Councils new website

## 0.70.0 (2024-05-17)

### Feat
- #708 Dev Container
- Dev Container
- #708 Dev Container
- #708 Dev Container
- #708 Dev Container
- #708 Dev Container
- #708 Dev Container
- #708 Dev Container
- #708 Dev Container
- #708 Dev Container
- #708 Dev Container
- #708 Dev Container
- #708 simplifying Selenium integration tests
- #708 simplifying Selenium integration tests
- #708 Test GH action seenium
- #708 Test GH action seenium
- #708 Test GH action seenium
- #708 Test GH action seenium
- #708 Test GH action seenium
- #708 Test GH action seenium
- #708 Test GH action seenium
- #708 Test GH action seenium
- #708 Test GH action seenium
- #708 Test GH action seenium
- #708 Dev Container testing
- #708
- dev container changes
- #706 Adding Dev Container
- #706 Adding initial Dev Container

## 0.69.7 (2024-05-17)

### Fix
- #713 BarnsleyMBCouncil.py

## 0.69.6 (2024-05-16)

### Fix
- #709 Update DoverDistrictCouncil.py

## 0.69.5 (2024-05-14)

### Fix
- #696 Small issue and Black formatting
- #696 Small issue and Black formatting
- #696 Small issue and Black formatting
- #696 Small issue and Black formatting
- #696 Small issue and Black formatting
- #696 Small issue and Black formatting
- #696 Small issue and Black formatting
- #696 Small issue and Black formatting
- #696 test coverage back to 100%

## 0.69.4 (2024-05-09)

### Fix
- pass in required parameter into `create_webdriver`
- test runners for `MiltonKeynesCityCouncil` and `NorthEastLincs`.
## 0.69.3 (2024-05-09)

### Fix
- fix AttributeError when no garden waste collection is available for properties using Huntingdon District Council
- add support for parsing "Today" / "Tomorrow" as date text for `BarnsleyMBCouncil`
- add support for parsing "Tomorrow" as date text for `LiverpoolCityCouncil`

## 0.69.1 (2024-05-01)

### Fix
- Handling the "Website cookies enhance your user experience." button
- Handling the "Website cookies enhance your user experience." button

## 0.69.0 (2024-04-28)

### Feat
- Adding Renfrewshire Council
- Adding Renfrewshire Council

## 0.68.2 (2024-04-28)

### Fix
- Remove 'import Dumper'

## 0.68.1 (2024-04-27)

### Fix
- input.json Bradford missing comma

## 0.68.0 (2024-04-27)

### Feat
- Add support for West Berkshire Council
- add support for Knowsley Metropolitan Borough Council
- add support for Cheshire West and Chester Council
- add support for Cheshire West and Chester Council

## 0.66.2 (2024-04-18)

### Fix
- Update HaringeyCouncil.py issue #670

## 0.66.1 (2024-04-15)

### Fix
- parse datetimes correctly and round to midnight

## 0.66.0 (2024-04-15)

## 0.65.2 (2024-04-15)

### Fix
- change address selection to fix errors selecting the user's PAON

## 0.65.1 (2024-04-15)

### Fix
- add check for parsed string length to stop datetime parsing error

## 0.65.0 (2024-04-13)

### Feat
- add Arun council
- add support for Sunderland City Council
- add support for Sunderland City Council

## 0.64.3 (2024-03-25)

### Fix
- sort data and correct dictionary name (#609)

## 0.64.2 (2024-03-24)

## 0.64.1 (2024-03-24)

### Fix
- fix Kirklees address search (switch to house & postcode)
- fixes json

## 0.64.0 (2024-03-23)

### Feat
- add Kirklees council

### Fix
- fixes json

## 0.63.0 (2024-03-23)

### Feat
- Add Solihull Council (#513)
- Add Adur and Worthing Councils (#544)
- Add Dover District Council (#614)
- Add Rochford Council (#620)
- Add Tandridge District Council (#621)
- Add West Northamptonshire Council (#567)
- Add Hull City Council (#622)
- Add Wyre Council (#625)
- Add Telford and Wrekin Co-operative Council (#632)
- Add Mansfield District Council (#560)
- Add Bedford Borough Council (#552)

### Fix
- spacing on input.json
- realign input.json
- capitalize bin type text
- formatting on input.json
- incorrect collections
- update testing URL for Merton
- attempt to resolve invisible banner hiding postcode box
- resolve JSON schema exception for date formatting
- resolve JSON schema exception for date formatting
- accept cookies banner

## 0.62.0 (2024-03-03)

### Fix
- Added missing .feature file entry to the test config for NewhamCouncil

## 0.61.1 (2024-02-16)

### Fix
- code optimisations
- Fix date parsing in WestLindseyDistrictCouncil.py

## 0.61.0 (2024-02-11)

### Feat
- Add Mole Valley District Council

## 0.60.1 (2024-02-03)

### Fix
- Update input.json Closes #599

## 0.60.0 (2024-01-28)

### Feat
- Add Scraper for St Albans City and District Council

## 0.59.1 (2024-01-25)

### Fix
- add wiki note for castlepoint
- update test data for castlepoint
- remove single line causing issues

## 0.59.0 (2024-01-20)

### Feat
- Add NorthYorkshire to test feature file
- Add north yorkshire to test input
- Add Support for north yorkshire council

### Fix
- remove unused code

## 0.58.8 (2024-01-19)

### Fix
- barnet no overrides

## 0.58.7 (2024-01-18)

### Fix
- accidentally returned strings when needed date objects, refactor to handle this
- checking for future/past dates

## 0.58.6 (2024-01-18)

### Fix
- correct date handling for North West Leicestershire

## 0.58.5 (2024-01-15)

### Fix
- Don't call driver.quit where already handled by finally block

## 0.58.4 (2024-01-15)

### Fix
- remove extra driver.quit to prevent errors

## 0.58.3 (2024-01-15)

### Feat
- Added support for Newham Council's bin collections

### Fix
- Add a default value for user_agent to fix all councils using selenium and not specifying agent

## 0.58.2 (2024-01-11)

### Fix
- use static values for bin types

## 0.58.1 (2024-01-10)

### Fix
- Eastleigh Borough Council doesnt cope with "You haven't yet signed up for ..."
- Eastleigh Borough Council doesnt cope when Garden Waste service hasn't been signed up for, which gets the value "You haven't yet signed up for our garden waste collections. Find out more about our\xa0garden waste collection service" which results in ValueError: time data

## 0.58.0 (2024-01-10)

### Feat
- Add Test Valley Borough Council

## 0.57.0 (2024-01-09)

### Feat
- Add support for Chorley Council

## 0.56.13 (2024-01-09)

### Fix
- update logic to account for council website change

## 0.56.12 (2024-01-09)

### Fix
- duplicate driver.quit() calls causes error

## 0.56.11 (2024-01-08)

### Fix
- Headless now working on custom comp Update sensor.py

## 0.56.10 (2024-01-08)

### Fix
- headless mode in custom component

## 0.56.9 (2024-01-08)

### Fix
- headless mode

## 0.56.8 (2024-01-08)

### Fix
- headless in custom comp

## 0.56.7 (2024-01-08)

### Fix
- headless options

## 0.56.6 (2024-01-07)

### Fix
- modified Kingston-upon-Thames driver for greater reliability.
## 0.56.5 (2024-01-07)

### Fix
- Update KingstonUponThamesCouncil.py

## 0.56.4 (2024-01-07)

### Fix
- Update KingstonUponThamesCouncil.py

## 0.56.3 (2024-01-07)

### Fix
- headless options
- #542
- Selenium Grid Sessions must be terminated cleanly
- #542
- Selenium Grid Sessions must be terminated cleanly

## 0.56.2 (2024-01-07)

### Fix
- Update strings.json
- Update en.json
- Update config_flow.py

## 0.56.1 (2024-01-07)

### Fix
- Update common.py

## 0.56.0 (2024-01-07)

### Feat
- Update strings.json
- Update en.json
- Update config_flow.py
- adding headless control
- adding headless control
- adding headless control

## 0.55.3 (2024-01-05)

### Fix
- Update lint.yml

## 0.55.2 (2024-01-05)

### Fix
- Chelmsford

## 0.55.1 (2024-01-05)

### Fix
- Update ChelmsfordCityCouncil.py
- Update ChelmsfordCityCouncil.py
- Update ChelmsfordCityCouncil.py

## 0.55.0 (2024-01-05)

### Feat
- Update codeql-analysis.yml
- Update behave.yml
- Update CONTRIBUTING.md
- Update behave.yml
- Update behave.yml
- Update ConwyCountyBorough.py
- Update behave.yml
- Update CheshireEastCouncil.py
- Update behave.yml
- Update behave.yml
- Update behave.yml
- Update Makefile
- Update Makefile
- Update behave.yml
- Update Makefile
- Update validate_council_outputs.feature

## 0.54.0 (2024-01-04)

### Feat
- Barnet seasonal overrides

## 0.53.2 (2024-01-04)

### Fix
- barnet (again)

## 0.53.1 (2024-01-04)

### Fix
- barnet

## 0.53.0 (2024-01-04)

### Feat
- barnet council

## 0.52.0 (2024-01-04)

### Feat
- #525 Adding API Server and Docker build
- #525 Adding API Server and Docker build

## 0.51.0 (2024-01-04)

### Feat
- #522 Adding Nottingham City Council

## 0.50.1 (2024-01-03)

### Fix
- don't ask for URL for Vale of White Horse Council

## 0.50.0 (2024-01-03)

### Feat
- add Vale of White Horse District Council

### Fix
- account for additional string on exceptional schedule

## 0.49.1 (2024-01-01)

### Fix
- Torbay

## 0.49.0 (2024-01-01)

### Feat
- add South Gloucestershire Council

## 0.48.3 (2024-01-01)

### Fix
- manifest.json

## 0.48.2 (2024-01-01)

### Fix
- manifest.json to remove depricated attribute

## 0.48.1 (2024-01-01)

### Fix
- Hacs Validation Pipeline

## 0.48.0 (2024-01-01)

### Feat
- Adding HACS Validation

## 0.47.0 (2024-01-01)

### Feat
- Add hassfest validation.yml

## 0.46.1 (2023-12-31)

### Fix
- Black formatting
- Fix GuildfordCouncil

## 0.46.0 (2023-12-31)

### Feat
- Adding Brighton and Hove City Council
- Adding Brighton and Hove City Council
- Adding Brighton and Hove City Council
- Adding Brighton and Hove City Council
- Adding London Borough Redbridge
- London Borough Redbridge
- Adding LondonBoroughRedbridge 431

### Fix
- chelmsford #407

## 0.45.0 (2023-12-29)

### Feat
- Add Haringey Council.

## 0.44.2 (2023-12-29)

### Fix
- #509 Wiltshire Update input.json

## 0.44.1 (2023-12-28)

### Fix
- Bexley
- CharnwoodBoroughCouncil

## 0.44.0 (2023-12-27)

### Feat
- Adding support for Gedling Borough Council

## 0.43.0 (2023-12-25)

### Feat
- add Newport City Council

## 0.42.1 (2023-12-24)

### Feat
- Initial Test Commit for Gedling Borough Council

### Fix
- CastlepointDistrictCouncil
- 191_fixingbroken_councils
- 191_fixingbroken_councils
- 191_fixingbroken_councils

## 0.42.0 (2023-12-19)

### Feat
- Adding West Lindsey District Council
- Adding West Lindsey District Council

## 0.41.5 (2023-12-18)

### Fix
- #191 Preston City Council

## 0.41.4 (2023-12-17)

### Fix
- #493 Update input.json

## 0.41.3 (2023-12-17)

### Fix
- #27 East Riding

## 0.41.2 (2023-12-17)

### Fix
- #493 Leeds issues

## 0.41.1 (2023-12-17)

### Fix
- Add in URL override for wiki
- Update RushmoorCouncil.py to use new URL

## 0.41.0 (2023-12-16)

### Feat
- #264 Adding Oldham
- #250 Adding Halton Borough Council
- #244 Adding Portsmouth City Council

### Fix
- #141 Leeds speed up
- #174 / #244 / #204

## 0.40.1 (2023-12-16)

### Fix
- 488_blackburnfixes

## 0.40.0 (2023-12-15)

### Feat
- adding #204 Forest_of_Dean_District
- adding #204 Forest_of_Dean_District

## 0.39.0 (2023-12-13)

### Feat
- Adding support for Reading Borough Council

## 0.38.0 (2023-12-12)

### Feat
- Add Shropshire Council

## 0.37.2 (2023-12-08)

### Fix
- Issue 394
- change coordinator data from numerical indexed list to dictionary

## 0.37.1 (2023-12-08)

### Fix
- add postcode and uprn for Bedfordshire Council

## 0.37.0 (2023-12-07)

### Feat
- Add BefordshireCouncil scraper

## 0.36.0 (2023-12-07)

### Feat
- adding NorthEastDerbyshireDistrictCouncil

## 0.35.1 (2023-12-06)

### Fix
- move logging config to collect_data script

## 0.35.0 (2023-12-06)

### Feat
- Adding North_West_Leicestershire
- Adding North_West_Leicestershire

## 0.34.0 (2023-12-05)

### Feat
- Add Sevenoaks District Council
- Add Barnsley Metropolitan Borough Council to the feature file
- Add Barnsley Metropolitan Borough Council to input.json
- Add support for Barnsley Council (#444)
- Add Dorset Council to feature file
- Add Dorset Council to input.json
- Add support for Dorset Council
- Add Rugby Borough Council to feature file
- Add Rugby Borough Council to input.json
- Add parser for Rugby Borough Council (#456)

## 0.32.1 (2023-12-04)

### Fix
- Move LiverpoolCityCouncil.py to correct folder

## 0.32.0 (2023-12-01)

### Feat
- Add extra files for Stoke-on-Trent support
- Add support for Stoke-on-Trent (re: #440)

## 0.31.1 (2023-12-01)

### Fix
- change logic to add correct years and support 'Tomorrow' results

## 0.31.0 (2023-12-01)

### Feat
- Add support for Environment First collections (re: #433)
- Add support for Environment First collections (re: #433)
- change parameter name of 'x' to 'step' in get_dates_every_x_days()

## 0.30.1 (2023-12-01)

### Fix
- Increase data update timeout for slower selenium based tests

## 0.30.0 (2023-11-30)

### Feat
- Added WestSuffolkCouncil

## 0.29.1 (2023-11-29)

### Fix
- Fix scraper for Bolton

## 0.29.0 (2023-11-26)

### Feat
- Add Mid and East Antrim
- Add Mid and East Antrim
- Add Mid and East Antrim

## 0.28.1 (2023-11-22)

### Fix
- basingstoke adapt to basingstoke site changes

## 0.28.0 (2023-11-08)

### Feat
- Add support files for Liverpool City Council
- Add additional comments
- Add Liverpool City Council parser

### Fix
- change dateutil name

## 0.27.2 (2023-11-08)

### Fix
- Custom component web driver field label

## 0.27.1 (2023-11-05)

### Fix
- 419-fix-selenium-behave-tests

## 0.27.0 (2023-11-04)

### Feat
- Update EastSuffolkCouncil.py
- Change bin_type's to be title() so it reads better
- Driver quit needs to be after last use of driver

## 0.26.0 (2023-11-03)

### Feat
- Add remote Selenium web driver support

## 0.25.0 (2023-11-03)

### Feat
- Update dev mode & remove JSON outputs
- Update dev mode & remove JSON outputs
- Update dev mode & remove JSON outputs

## 0.24.3 (2023-11-01)

### Feat
- Add remote Selenium web driver support
- Add remote Selenium web driver support
- Add remote Selenium web driver support
- Add remote Selenium web driver support

### Fix
- Holidays subdivision error

## 0.24.2 (2023-11-01)

### Fix
- #378 update East Northamptionshire to North Northamptonshire

## 0.24.1 (2023-11-01)

### Fix
- 410 Adding more behave logging and hamcrest assertations

## 0.24.0 (2023-10-31)

### Feat
- Replace individual council schema's with a single common one

## 0.23.2 (2023-10-30)

### Fix
- #399
- DeprecationWarning: Python Package holidays

## 0.23.1 (2023-10-30)

### Fix
- unit test coverage

## 0.23.0 (2023-10-30)

### Feat
- Add support for Conwy council

## 0.22.0 (2023-10-30)

## 0.21.3 (2023-10-29)

### Feat
- Add support for Calderdale Council

### Fix
- Home Assistant custom component fix for Selenium based councils
- Home Assistant custom component fix for Selenium based councils
- Fix Chelmsford City Council
- Fix input.json order

## 0.21.1 (2023-10-24)

### Fix
- Fix the incorrect key collectionTime in json output of Salford Council

## 0.21.0 (2023-10-23)

### Feat
- Add support for West Lothian Council
- Add support for East Lindsey District Council
- Add support for Gateshead Council
- Add support for Staffordshire Moorlands District Council

## 0.20.0 (2023-10-20)

### Feat
- Add support for Cannock Chase District Council

## 0.19.0 (2023-10-19)

### Feat
- fix missing comma in test input for eastsuffolkcouncil

## 0.18.0 (2023-10-19)

### Feat
- Add EastSuffolkCouncil support

## 0.17.0 (2023-10-19)

### Feat
- Add support for Bury Council (#265)
- Add support for Bury Council (#265)

### Fix
- correctly align input.json

## 0.16.0 (2023-10-18)

### Feat
- Add support for Neath Port Talbot Council

## 0.15.0 (2023-10-18)

### Feat
- StratfordUponAvonCouncil Addition

## 0.14.0 (2023-10-18)

### Feat
- Rename Chilterns to Buckinghamshire Council

## 0.13.4 (2023-10-16)

### Fix
- Update poetry.lock to allow any urllib3 version

## 0.13.3 (2023-10-15)

### Fix
- Remove options flow from home assistant custom component

## 0.13.2 (2023-10-15)

### Fix
- Update poetry.lock

## 0.13.1 (2023-10-15)

### Fix
- Remove first BS4 call to stop page read
- fix ValueError and add in correct year data
- swap Crawley's USRN for house number
- fix date parsing and change BS4 logic

## 0.13.0 (2023-10-11)

### Feat
- Add supporting files for Rhondda Cynon Taff Council

## 0.12.1 (2023-09-28)

### Feat
- Add support for Reigate and Banstead Borough Council

### Fix
- Fix for Wakefield City Council custom component support
- Fix for Wakefield City Council custom component support

## 0.11.0 (2023-09-27)

### Feat
- Add support for Bath and North East Somerset Council
- Add support for multiple instances of the custom component

### Fix
- Fix Python Semantic Release version
- Fix Wakefield City Council

## 0.10.1 (2023-09-16)

## 0.10.0 (2023-09-16)

## 0.9.0 (2023-07-28)

## 0.8.0 (2023-07-23)

## 0.7.0 (2023-07-23)

## 0.6.0 (2023-07-22)

## 0.5.0 (2023-07-21)

## 0.4.0 (2023-07-20)

## 0.3.0 (2023-07-18)

## 0.2.0 (2023-07-16)

## 0.1.0 (2023-07-16)

================================================
FILE: CODE_OF_CONDUCT.md
================================================

# Contributor Covenant Code of Conduct

## Our Pledge

We as members, contributors,
and leaders pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation.

We pledge to act and interact in ways that contribute to an open, welcoming, diverse, inclusive, and healthy community.

## Our Standards

Examples of behavior that contributes to a positive environment for our community include:

* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes, and learning from the experience
* Focusing on what is best not just for us as individuals, but for the overall community

Examples of unacceptable behavior include:

* The use of sexualized language or imagery, and sexual attention or advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a professional setting

## Enforcement Responsibilities

Community leaders are responsible for clarifying and enforcing our standards of acceptable behavior and will take appropriate and fair corrective action in response to any behavior that they deem inappropriate, threatening, offensive, or harmful.

Community leaders have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, and will communicate reasons for moderation decisions when appropriate.

## Scope

This Code of Conduct applies within all community spaces, and also applies when an individual is officially representing the community in public spaces. Examples of representing our community include using an official e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event.

## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be reported to the community leaders responsible for enforcement at . All complaints will be reviewed and investigated promptly and fairly.

All community leaders are obligated to respect the privacy and security of the reporter of any incident.

## Enforcement Guidelines

Community leaders will follow these Community Impact Guidelines in determining the consequences for any action they deem in violation of this Code of Conduct:

### 1. Correction

**Community Impact**: Use of inappropriate language or other behavior deemed unprofessional or unwelcome in the community.

**Consequence**: A private, written warning from community leaders, providing clarity around the nature of the violation and an explanation of why the behavior was inappropriate. A public apology may be requested.

### 2. Warning

**Community Impact**: A violation through a single incident or series of actions.

**Consequence**: A warning with consequences for continued behavior. No interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, for a specified period of time. This includes avoiding interactions in community spaces as well as external channels like social media. Violating these terms may lead to a temporary or permanent ban.

### 3. Temporary Ban

**Community Impact**: A serious violation of community standards, including sustained inappropriate behavior.

**Consequence**: A temporary ban from any sort of interaction or public communication with the community for a specified period of time. No public or private interaction with the people involved, including unsolicited interaction with those enforcing the Code of Conduct, is allowed during this period. Violating these terms may lead to a permanent ban.

### 4. Permanent Ban

**Community Impact**: Demonstrating a pattern of violation of community standards, including sustained inappropriate behavior, harassment of an individual, or aggression toward or disparagement of classes of individuals.

**Consequence**: A permanent ban from any sort of public interaction within the community.

## Attribution

This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 2.0, available at https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.

Community Impact Guidelines were inspired by [Mozilla's code of conduct enforcement ladder](https://github.com/mozilla/diversity).

[homepage]: https://www.contributor-covenant.org

For answers to common questions about this code of conduct, see the FAQ at https://www.contributor-covenant.org/faq. Translations are available at https://www.contributor-covenant.org/translations.

================================================
FILE: COMPATIBILITY.md
================================================

# Home Assistant Compatibility

This document outlines the Home Assistant compatibility testing for the UK Bin Collection custom component.

## Supported Versions

The UK Bin Collection custom component is tested against the following Home Assistant versions:

- **Minimum supported**: Home Assistant 2023.10.0
- **Recommended**: Latest stable release
- **Development**: Latest dev builds (may have issues)

## Automated Testing

### GitHub Workflows

1. **Home Assistant Compatibility Test** (`.github/workflows/ha_compatibility_test.yml`)
   - Runs on every push to master/main
   - Tests against multiple HA versions
   - Validates component imports and manifest
   - Runs weekly to catch breaking changes

2.
**HACS Validation** (`.github/workflows/hacs_validation.yml`) - Includes HassFest validation - HACS action validation - Quick compatibility check ### Manual Testing Run the compatibility checker locally: ```bash # From the project root directory python scripts/check_ha_compatibility.py ``` This script will: - ✅ Validate manifest.json structure - ✅ Test component module imports - ✅ Check Home Assistant version - ✅ Verify dependencies are installed ## Compatibility Matrix | Home Assistant Version | Status | Notes | |------------------------|--------|-------| | 2023.10.x | ✅ Supported | Minimum version | | 2023.12.x | ✅ Supported | Stable | | 2024.1.x | ✅ Supported | Stable | | 2024.3.x | ✅ Supported | Stable | | 2024.6.x | ✅ Supported | Stable | | 2024.9.x | ✅ Supported | Stable | | 2024.12.x | ✅ Supported | Latest stable | | dev | ⚠️ Testing | May have breaking changes | ## Breaking Changes ### Home Assistant 2023.10.0 - Minimum Python version: 3.12 - Updated async patterns required ### Future Considerations - Monitor HA core API changes - Update component when deprecated features are removed - Test against beta releases before stable release ## Troubleshooting ### Common Issues 1. **Import Errors** - Ensure Home Assistant is properly installed - Check Python version compatibility (≥3.12) - Verify uk-bin-collection package is installed 2. **Manifest Validation Failures** - Check manifest.json syntax - Ensure all required fields are present - Verify version numbers match 3. **Component Load Failures** - Check Home Assistant logs - Verify component files are in correct location - Ensure dependencies are satisfied ### Getting Help If you encounter compatibility issues: 1. Check the [GitHub Issues](https://github.com/robbrad/UKBinCollectionData/issues) 2. Run the compatibility checker: `python scripts/check_ha_compatibility.py` 3. Post in the [Home Assistant Community Thread](https://community.home-assistant.io/t/bin-waste-collection/55451) 4. 
Create a new issue with: - Home Assistant version - Component version - Error logs - Compatibility check output ## For Developers ### Adding New HA Version Tests 1. Update `.github/workflows/ha_compatibility_test.yml` 2. Add new version to the matrix 3. Test locally first: `python scripts/check_ha_compatibility.py` 4. Update compatibility matrix in this document ### Testing Locally ```bash # Install specific HA version pip install homeassistant==2024.12.0 # Install component in development mode pip install -e . # Run compatibility check python scripts/check_ha_compatibility.py # Run component tests python -m pytest custom_components/uk_bin_collection/tests/ ``` ================================================ FILE: CONTRIBUTING.md ================================================ # Contents - [Contents](#contents) - [Contributor guidelines](#contributor-guidelines) - [Getting Started](#getting-started) - [Environment Setup](#environment-setup) - [Project Aims](#project-aims) - [What can I contribute to?](#what-can-i-contribute-to) - [Claiming an issue](#claiming-an-issue) - [Pushing your changes](#pushing-your-changes) - [Adding a scraper](#adding-a-scraper) - [Developing](#developing) - [Developing using our Dev Container](#developing-using-our-dev-container) - [Prerequisites](#prerequisites) - [Step 1: Clone the Repository](#step-1-clone-the-repository) - [Step 2: Set Up Docker](#step-2-set-up-docker) - [Step 3: Open the Project in VSCode](#step-3-open-the-project-in-vscode) - [Step 4: Reopen in Container](#step-4-reopen-in-container) - [Step 5: Verify the Development Environment](#step-5-verify-the-development-environment) - [Developing](#developing-1) - [Kwargs](#kwargs) - [Common Functions](#common-functions) - [Additional files](#additional-files) - [Input JSON file](#input-json-file) - [Testing](#testing) - [Behave (Integration Testing)](#behave-integration-testing) - [Running the Behave tests for all councils](#running-the-behave-tests-for-all-councils) - 
[Running the Behave tests for a specific council](#running-the-behave-tests-for-a-specific-council)
- [GitHub Actions Integration Tests](#github-actions-integration-tests)
- [Test Results](#test-results)
- [Allure Report](#allure-report)
- [CodeCov Report](#codecov-report)
- [Pytest (Unit Testing)](#pytest-unit-testing)
- [Running the Unittests](#running-the-unittests)
- [Contact info](#contact-info)

# Contributor guidelines

This document contains guidelines on contributing to the UKBCD project, including how the project works, how to set up the environment, how we use our issue tracker, and how you can develop more scrapers.

## Getting Started

You will need to install Python on the system you plan to run the script from. This project is tested against Python 3.12. The project uses [poetry](https://python-poetry.org/docs/) to manage dependencies and set up the build environment.

### Environment Setup

```
pip install poetry

# Clone the Repo
git clone https://github.com/robbrad/UKBinCollectionData
cd UKBinCollectionData

# Install Dependencies
poetry install
poetry shell
```

## Project Aims

- To provide a real-world environment to learn Python and/or web scraping
- To provide UK bin data in a standardised format for use (albeit not exclusively) with [HomeAssistant](https://www.home-assistant.io/)

### What can I contribute to?

- The majority of project work comes from developing new scrapers for requested councils. These can be found on the [issue tracker](https://github.com/robbrad/UKBinCollectionData/labels/council%20request) with `council request` labels.
- Tasks that require [additional input](https://github.com/robbrad/UKBinCollectionData/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22) have the `help wanted` label - these can be trickier requests or may have many smaller tasks.
- [Easier tasks](https://github.com/robbrad/UKBinCollectionData/labels/good%20first%20issue) that would be a good fit for people new to the project or the world of web scraping are labelled with the `good first issue` label.

## Claiming an issue

If there is an existing issue you wish to work on, please do the following things:

- Assign the issue to yourself (or ask someone to assign you) - that way, others know you're working on it.
- Create a new branch - it's recommended to use the 'create a branch' option on the issue page, create it in your forked repo and then check out the branch locally (or in your IDE).

**NB:** Exploratory work doesn't require claiming an issue - you only need to claim if you plan on developing the full scraper and associated files. If you just want to explore an issue, feel free to do so - and also feel free to post anything helpful in the issue comments.

## Pushing your changes

There are guides below on how to add a scraper to the project, along with what files are needed and what tests should be run. When the time comes to push your changes, please be aware that we use [conventional commits](https://www.conventionalcommits.org/en/v1.0.0/#summary) to provide a clear summary of what a change does. This means that commit messages should start with one of the following:

- `feat:` for a new feature (including a new scraper)
- `fix:` for when a bug is fixed or an issue is resolved
- `docs:` for when changes to documentation are made

Don't worry if you forget - commit messages are automatically checked by a lint checker when you open a merge request, and can easily be rectified by recommitting or pushing again with the correct prefix.

# Adding a scraper

This project uses a design pattern called the [Template Method](https://refactoring.guru/design-patterns/template-method), which allows for a structured class that can be extended.
In our case, the getting of the data from the council and the presentation of the JSON remains the same via the [abstract class](https://github.com/robbrad/UKBinCollectionData/blob/master/uk_bin_collection/uk_bin_collection/get_bin_data.py#L21) - however the scraping of each council is different and this allows us to have a class for each [council](https://github.com/robbrad/UKBinCollectionData/tree/master/uk_bin_collection/uk_bin_collection/councils) - you can see this in action [here](https://github.com/robbrad/UKBinCollectionData/blob/master/uk_bin_collection/uk_bin_collection/councils/CheshireEastCouncil.py#L5,L16). There are a few different options for scraping, and you are free to choose whichever best suits the council: - Using [Beautiful Soup 4](https://github.com/robbrad/UKBinCollectionData/blob/master/uk_bin_collection/uk_bin_collection/councils/CheshireEastCouncil.py) - Using the [requests](https://github.com/robbrad/UKBinCollectionData/blob/master/uk_bin_collection/uk_bin_collection/councils/ManchesterCityCouncil.py) module - Reading data from [external files](https://github.com/robbrad/UKBinCollectionData/blob/master/uk_bin_collection/uk_bin_collection/councils/LeedsCityCouncil.py) - Using [Selenium](https://github.com/robbrad/UKBinCollectionData/blob/master/uk_bin_collection/uk_bin_collection/councils/Chilterns.py) to automate browser behaviour ## Developing To get started, first you will need to fork this repository and setup your own working environment before you can start developing. ### Developing using our Dev Container You need to set up Docker, Visual Studio Code (VSCode), and a development container (devcontainer) after cloning the repository at https://github.com/robbrad/UKBinCollectionData. 
#### Prerequisites Before you start, make sure you have the following installed on your computer: - Docker: [Download Docker](https://www.docker.com/products/docker-desktop) - Visual Studio Code (VSCode): [Download VSCode](https://code.visualstudio.com/download) - Remote - Containers extension for VSCode: Install it from the VSCode Marketplace or directly from the Extensions view (`Ctrl+Shift+X` in VSCode and search for "Remote - Containers"). #### Step 1: Clone the Repository First, clone the repository to your local machine. Open a terminal and run the following command: ```bash git clone https://github.com/robbrad/UKBinCollectionData.git ``` Navigate into the directory: ```bash cd UKBinCollectionData ``` #### Step 2: Set Up Docker Ensure Docker is running on your system. You can verify this by running: ```bash docker -v ``` This should return the version of Docker installed. If Docker is running, you’ll see no errors. #### Step 3: Open the Project in VSCode Open VSCode, and then open the cloned repository by going to `File > Open Folder...` and selecting the `UKBinCollectionData` folder. #### Step 4: Reopen in Container Once the folder is open in VSCode: 1. A prompt might appear asking you to reopen in a container. If it does, select "Reopen in Container". 2. If you don’t see the prompt, press `F1` to open the command palette, type "Remote-Containers: Reopen in Container", and select that option. VSCode will start building the Docker container as defined in the `.devcontainer/` folder in the repository. This process can take a few minutes as it involves downloading the base Docker and Selenium hub images and setting up the environment. #### Step 5: Verify the Development Environment Once the container is set up, VSCode will connect to it automatically. You can start editing and running the code inside the container. 
This ensures that your development environment is consistent and controlled, replicating the same settings and tools as specified in the devcontainer configuration.

### Developing

Once your environment is ready, create a new branch from your master/main branch, then run:

```
poetry run python uk_bin_collection/uk_bin_collection/create_new_council.py "CouncilName" "CouncilURL"
```

The new .py file will be used in the CLI to call the parser, so be sure to pick a sensible name - e.g. CheshireEastCouncil.py is called with:

```
python collect_data.py CheshireEastCouncil
```

To simplify things somewhat, a [template](https://github.com/robbrad/UKBinCollectionData/blob/master/uk_bin_collection/uk_bin_collection/councils/council_class_template/councilclasstemplate.py) file has been created - open this file, copy the contents to your new .py file and start from there. The create script above will:

1. Create a council class file under the councils folder
2. Make an entry in input.json

You are pretty much free to approach the scraping however you would like, but please ensure that:

- Your scraper returns a dictionary made up of the key "bins" and a value that is a list of bin types and collection dates (an example of this can be seen below).
- Any dates or times are formatted to standard UK formats (see [below](#common-functions)).
Output Example:

```json
{
    "bins": [
        {
            "type": "Empty Standard Mixed Recycling",
            "collectionDate": "29/07/2022"
        },
        {
            "type": "Empty Standard Garden Waste",
            "collectionDate": "29/07/2022"
        },
        {
            "type": "Empty Standard General Waste",
            "collectionDate": "05/08/2022"
        }
    ]
}
```
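As a minimal, self-contained sketch of producing that structure (not a real scraper - the bin data and the `build_bin_data` helper are made up for illustration; real scrapers extend the project's abstract class), assuming the project's standard UK date format:

```python
from datetime import datetime

# Standardised UK date format (assumed to match the `date_format`
# variable in common.py, given the DD/MM/YYYY output examples)
date_format = "%d/%m/%Y"

def build_bin_data(collections):
    """Assemble (bin type, collection datetime) pairs into the standard
    output dictionary that every scraper is expected to return."""
    return {
        "bins": [
            {"type": bin_type, "collectionDate": when.strftime(date_format)}
            for bin_type, when in collections
        ]
    }

# Hypothetical scrape results
data = build_bin_data(
    [
        ("Empty Standard Mixed Recycling", datetime(2022, 7, 29)),
        ("Empty Standard General Waste", datetime(2022, 8, 5)),
    ]
)
print(data["bins"][0])  # {'type': 'Empty Standard Mixed Recycling', 'collectionDate': '29/07/2022'}
```

Building the dates with `strftime(date_format)` rather than string concatenation keeps every scraper's output consistent with the schema the integration tests validate.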
### Kwargs

UKBCD has two mandatory parameters when it runs - the name of the parser (sans .py) and the URL from which to scrape. However, developers can also get the following data using `kwargs`:

| Parameter | Prompt | Notes | kwargs.get |
|-----------------------------------------|--------------------------|-------------------------------------------------------------|------------------------------|
| UPRN (Unique Property Reference Number) | `-u` or `--uprn` | | `kwargs.get('uprn')` |
| USRN (Unique Street Reference Number) | `-us` or `--usrn` | | `kwargs.get('usrn')` |
| House number | `-n` or `--number` | Sometimes called PAON | `kwargs.get('paon')` |
| Postcode | `-p` or `--postcode` | Needs to be wrapped in quotes on the CLI | `kwargs.get('postcode')` |
| Skip Get URL | `-s` or `--skip_get_url` | | `kwargs.get('skip_get_url')` |
| URL for remote Selenium web driver | `-w` or `--web_driver` | Needs to be wrapped in quotes on the CLI | `kwargs.get('web_driver')` |
| Development Mode | `-d` or `--dev_mode` | Create/update council's entry in the input.json on each run | `kwargs.get('dev_mode')` |

These parameters are useful if you're using something like the requests module and need to take additional user information into the request, such as:

```commandline
python collect_data.py LeedsCityCouncil https://www.leeds.gov.uk/residents/bins-and-recycling/check-your-bin-day -p "LS1 2JG" -n 41
```

In the scraper, the following code takes the inputted parameters and uses them in two different variables:

```python
user_postcode = kwargs.get("postcode")
user_paon = kwargs.get("paon")
```

Each parameter also has its own validation method that should be called after the `kwargs.get`:

- `check_uprn()`
- `check_paon()`
- `check_postcode()`

The first two are simple validators - if the parameter is used but no value is given, they will throw an exception.
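The real validators live in `common.py`; the stand-ins below are simplified illustrations of the "throw if used but empty" pattern (the names mirror the list above, but the bodies and messages are assumptions, not the project's actual implementations):

```python
def check_paon(paon):
    """Simplified stand-in: raise if the house number was passed but empty."""
    if paon is None or str(paon).strip() == "":
        raise ValueError("Invalid house number")

def check_uprn(uprn):
    """Simplified stand-in: raise if the UPRN is missing or non-numeric."""
    if not uprn or not str(uprn).isdigit():
        raise ValueError("Invalid UPRN")

check_paon("41")            # valid - no exception
check_uprn("100012791226")  # valid - no exception
try:
    check_paon("")
except ValueError as err:
    print(err)  # Invalid house number
```

Calling the validator immediately after the `kwargs.get` means a scraper fails fast with a clear message instead of producing a confusing error halfway through a page parse.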
`check_postcode()` works differently: instead, it makes a call to the [postcodes.io](https://postcodes.io/) API to check whether the postcode exists. An exception will only be thrown here if the response code is not `HTTP 200`.

### Common Functions

The project has a small but growing library of functions (and the occasional variable) that are useful when scraping websites or calendars - aptly named [common.py](https://github.com/robbrad/UKBinCollectionData/blob/master/uk_bin_collection/uk_bin_collection/common.py). Useful functions include:

- functions to [add ordinals](https://github.com/robbrad/UKBinCollectionData/blob/e49da2f43143ac7c65fbeaf35b5e86b3ea19e31b/uk_bin_collection/uk_bin_collection/common.py#L72) to dates (04 becomes 4th) or [remove them](https://github.com/robbrad/UKBinCollectionData/blob/e49da2f43143ac7c65fbeaf35b5e86b3ea19e31b/uk_bin_collection/uk_bin_collection/common.py#L86) (4th becomes 04)
- a function to check [if a date is a holiday](https://github.com/robbrad/UKBinCollectionData/blob/e49da2f43143ac7c65fbeaf35b5e86b3ea19e31b/uk_bin_collection/uk_bin_collection/common.py#L117) in a given part of the UK
- a function that returns the [dates of a given weekday](https://github.com/robbrad/UKBinCollectionData/blob/e49da2f43143ac7c65fbeaf35b5e86b3ea19e31b/uk_bin_collection/uk_bin_collection/common.py#L136) in N amounts of weeks
- a function that returns a [list of dates every N days](https://github.com/robbrad/UKBinCollectionData/blob/e49da2f43143ac7c65fbeaf35b5e86b3ea19e31b/uk_bin_collection/uk_bin_collection/common.py#L148) from a given start date
- a function to check [if a string contains a date](./uk_bin_collection/uk_bin_collection/common.py#L249) (leverages [dateutil's parser](https://dateutil.readthedocs.io/en/stable/parser.html))

`common.py` also contains a [standardised date format](https://github.com/robbrad/UKBinCollectionData/blob/e49da2f43143ac7c65fbeaf35b5e86b3ea19e31b/uk_bin_collection/uk_bin_collection/common.py#L11) variable called
`date_format`, which is useful to call when formatting datetimes.

Please feel free to contribute to this library as you see fit - added functions should include the following:

- a clear, lowercase and underscored name
- parameter types
- a return type (if there is one)
- a docstring describing what the function does, as well as parameter and return type descriptors

## Additional files

In order for your scraper to work with the project's testing suite, some additional files need to be provided or modified:

- [ ] [Input JSON file](#input-json-file)

**Note:** from here on, anything containing `` should be replaced with the scraper's name.

### Input JSON file

| Type | File location |
|--------|----------------------------------------------------------|
| Modify | `UKBinCollectionData/uk_bin_collection/tests/input.json` |

Each council should have a node that matches the scraper's name. The node should include arguments in curly braces - the URL is mandatory, but any additional parameters like UPRN or postcode should also be provided. Councils should be listed in alphabetical order.

A "wiki_name" argument with the council's full name should also be provided. A "wiki_note" argument should be used where instructions beyond simply providing UPRN/Postcode/House Number parameters are needed. A "wiki_command_url_override" argument should be used where parts of the URL need to be replaced by the user, to allow a valid URL to be left for the integration tests. A new [Wiki](https://github.com/robbrad/UKBinCollectionData/wiki/Councils) entry will be generated automatically from this file's details.

**Note:** If you want the integration test to work you must supply real, working data (a business address is recommended - the council's address is usually a good one).
Example:

```json
"CheshireEastCouncil": {
    "uprn": "100012791226",
    "url": "https://online.cheshireeast.gov.uk/MyCollectionDay/SearchByAjax/GetBartecJobList?uprn=100012791226&onelineaddress=3%20COBBLERS%20YARD,%20SK9%207DZ&_=1621149987573",
    "wiki_name": "Cheshire East Council",
    "wiki_command_url_override": "https://online.cheshireeast.gov.uk/MyCollectionDay/SearchByAjax/GetBartecJobList?uprn=XXXXXXXX&onelineaddress=XXXXXXXX&_=1621149987573",
    "wiki_note": "Both the UPRN and a one-line address are passed in the URL, which needs to be wrapped in double quotes. The one-line address is made up of the house number, street name and postcode.\nUse the form [here](https://online.cheshireeast.gov.uk/mycollectionday/) to find them, then take the first line and post code and replace all spaces with `%20`."
},
```
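Before opening a pull request, it can be worth sanity-checking your new node. This standalone sketch (not part of the repo's tooling - the helper names and cut-down JSON are hypothetical) verifies the mandatory keys described above and the alphabetical ordering:

```python
import json

REQUIRED_KEYS = {"url", "wiki_name"}  # minimum fields per the guidance above

def missing_keys(node):
    """Return the required keys absent from a single council node."""
    return sorted(REQUIRED_KEYS - node.keys())

def is_alphabetical(councils):
    """Councils in input.json should be listed in alphabetical order."""
    names = list(councils)
    return names == sorted(names)

# Hypothetical, cut-down input.json content
councils = json.loads("""
{
  "AberdeenCityCouncil": {"url": "https://example.org", "wiki_name": "Aberdeen City Council"},
  "CheshireEastCouncil": {"wiki_name": "Cheshire East Council"}
}
""")

for name, node in councils.items():
    for key in missing_keys(node):
        print(f"{name}: missing '{key}'")  # CheshireEastCouncil: missing 'url'
print("alphabetical:", is_alphabetical(councils))  # alphabetical: True
```

A check like this catches a forgotten `wiki_name` or an out-of-order entry before the integration tests (and the auto-generated Wiki) trip over it.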
## Testing

### Behave (Integration Testing)

As with any web scraping project, there's a reliance on the council not changing their website - if this happens, Beautiful Soup will fail to read the site correctly and the expected data will not be returned. To mitigate this and stay on top of "what works and what needs work", we have created a set of integration tests which run a [feature](https://github.com/robbrad/UKBinCollectionData/blob/master/uk_bin_collection/tests/features/validate_council_outputs.feature) file. Based on the [input.json](https://github.com/robbrad/UKBinCollectionData/blob/master/uk_bin_collection/tests/input.json), this does an actual live run against the council's site and validates whether the returned data is JSON and conforms to the common format [JSON Schema](https://github.com/robbrad/UKBinCollectionData/tree/master/uk_bin_collection/tests/output.schema).

By default, if the council is a Selenium-based council it will run in headless mode. If you pass `--headless=False` to pytest (possible in a VS Code launch.json, useful for debugging code), it will run in a visible browser.
It also defaults the Selenium URL to `http://localhost:4444` and `local_browser` to False.

You can set pytest to test in your local web browser without Selenium Grid by setting `--local_browser=True`. If you want a different Selenium URL you can set it with `--selenium_url=http://selenium:4444`. NOTE: `--selenium_url` has no effect when `--local_browser=True` (default: False), as Selenium Grid is bypassed.

In VSCode, if you create a launch.json you can debug the tests locally with the following setup:

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python Debugger: Current File",
            "type": "debugpy",
            "request": "launch",
            "purpose": ["debug-test"],
            "env": {
                "PYTEST_ADDOPTS": "--headless=False --local_browser=True"
            }
        }
    ]
}
```

It is also possible to run:

```commandline
# Visible Selenium run in a local browser
poetry run pytest uk_bin_collection/tests/step_defs/ -k "Council_Name" --headless=False --local_browser=True

# Visible Selenium run on a Selenium Grid
poetry run pytest uk_bin_collection/tests/step_defs/ -k "Council_Name" --headless=False --selenium_url=http://localhost:4444
```

#### Running the Behave tests for all councils

```commandline
cd UKBinCollectionData
poetry shell
poetry run pytest uk_bin_collection/tests/step_defs/ -n logical
```

#### Running the Behave tests for a specific council

```commandline
cd UKBinCollectionData
poetry shell
poetry run pytest uk_bin_collection/tests/step_defs/ -n logical -k "BarnetCouncil"
```

#### GitHub Actions Integration Tests

The [GitHub Actions workflow](https://github.com/robbrad/UKBinCollectionData/actions/workflows/behave.yml) is set to run on push and pull_request. It uses a [Makefile](https://github.com/robbrad/UKBinCollectionData/blob/master/Makefile) to run the [Behave](#behave-integration-testing) tests to ensure the councils are all still working.

#### Test Results

##### Allure Report

The GitHub Actions workflow publishes the Allure Behave test results to GitHub Pages at https://robbrad.github.io/UKBinCollectionData//, e.g.
https://robbrad.github.io/UKBinCollectionData/3.9/. You can check this to see if a council is still working as expected.

##### CodeCov Report

The CodeCov.io report can be found [here](https://app.codecov.io/gh/robbrad/UKBinCollectionData).

### Pytest (Unit Testing)

As well as integration testing, the repo is set up to unit test some of the static methods, to ensure basic core functionality.

#### Running the Unittests

```commandline
cd UKBinCollectionData
poetry shell
poetry run coverage run --omit "*/tests/*" -m pytest uk_bin_collection/tests --ignore=uk_bin_collection/tests/step_defs/
poetry run coverage xml
```

# Contact info

If you have questions or comments, you can reach the project contributors in the following ways:

- Council requests can be submitted [here](https://github.com/robbrad/UKBinCollectionData/issues/new?assignees=&labels=Class%3A+enhancement&template=COUNCIL_REQUEST.yaml)
- General questions or comments can be submitted [here](https://github.com/robbrad/UKBinCollectionData/discussions/categories/q-a)

================================================
FILE: LICENSE
================================================

MIT License

Copyright (c) 2022 Robert Bradley

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ================================================ FILE: Makefile ================================================ .PHONY: install pre-build build black pycodestyle update-wiki ## @CI_actions Installs the checked out version of the code to your poetry managed venv install: poetry install --without dev install-dev: poetry install ## @CI_actions Runs code quality checks pre-build: black unit-tests rm setup.py || echo "There was no setup.py" poetry show --no-dev | awk '{print "poetry add "$$1"=="$$2}' | sort | sh ## @CI_actions Builds the project into an sdist build: poetry build -f sdist ## @Code_quality Runs black on the checked out code black: poetry run black **/*.py ## @Code_quality Runs pycodestyle on the checked out code pycodestyle: poetry run pycodestyle --statistics -qq uk_bin_collection ## @Testing Runs integration tests integration-tests: # Ensure directory exists mkdir -p build/$(matrix)/integration-test-results # Turn off "exit on error" so we can capture the code set +e; \ if [ -z "$(councils)" ]; then \ poetry run pytest uk_bin_collection/tests/step_defs/ \ -n logical \ --junit-xml=build/$(matrix)/integration-test-results/junit.xml; \ else \ poetry run pytest uk_bin_collection/tests/step_defs/ \ -k "$(councils)" \ -n logical \ --junit-xml=build/$(matrix)/integration-test-results/junit.xml; \ fi; \ RESULT=$$?; \ set -e; \ # Double-check that the file exists (in case of a really early crash) if [ ! 
-f build/$(matrix)/integration-test-results/junit.xml ]; then \ echo "" \ > build/$(matrix)/integration-test-results/junit.xml; \ fi; \ exit $$RESULT generate-test-map-test-results: poetry run python uk_bin_collection/tests/generate_map_test_results.py build/integration-test-results/junit.xml > build/integration-test-results/test_results.json parity-check: poetry run python uk_bin_collection/tests/council_feature_input_parity.py "$(repo)" "$(branch)" unit-tests: poetry run coverage erase - poetry run coverage run --append --omit "*/tests/*" -m pytest -vv -s --log-cli-level=DEBUG uk_bin_collection/tests custom_components/uk_bin_collection/tests --ignore=uk_bin_collection/tests/step_defs/ poetry run coverage xml update-wiki: poetry run python wiki/generate_wiki.py ================================================ FILE: README.md ================================================ [![Made with Python](https://img.shields.io/badge/Made%20With%20Python-red?style=for-the-badge&logo=python&logoColor=white&labelColor=red)](https://www.python.org) [![HACS Badge](https://img.shields.io/badge/HACS-Custom-41BDF5.svg?style=for-the-badge)](https://github.com/robbrad/UKBinCollectionData) [![Current Release](https://img.shields.io/github/v/release/robbrad/UKBinCollectionData?style=for-the-badge&filter=*)](https://github.com/robbrad/UKBinCollectionData/releases) [![PyPi](https://img.shields.io/pypi/v/uk_bin_collection?label=PyPI&logo=pypi&style=for-the-badge&color=blue)](https://pypi.org/project/uk-bin-collection/) [![GitHub license](https://img.shields.io/github/license/robbrad/UKBinCollectionData?style=for-the-badge)](https://github.com/robbrad/UKBinCollectionData/blob/master/LICENSE) [![GitHub issues](https://img.shields.io/github/issues-raw/robbrad/UKBinCollectionData?style=for-the-badge)](https://github.com/robbrad/UKBinCollectionData/issues?q=is%3Aopen+is%3Aissue) [![GitHub closed 
issues](https://img.shields.io/github/issues-closed-raw/robbrad/UKBinCollectionData?style=for-the-badge)](https://github.com/robbrad/UKBinCollectionData/issues?q=is%3Aissue+is%3Aclosed) [![GitHub contributors](https://img.shields.io/github/contributors/robbrad/UKBinCollectionData?style=for-the-badge)](https://github.com/robbrad/UKBinCollectionData/graphs/contributors) [![Test Councils](https://img.shields.io/github/actions/workflow/status/robbrad/UKBinCollectionData/behave.yml?style=for-the-badge&label=Test+Councils)](https://github.com/robbrad/UKBinCollectionData/actions/workflows/behave.yml) ![Codecov](https://img.shields.io/codecov/c/gh/robbrad/UKBinCollectionData?style=for-the-badge) [![CodeQL Analysis](https://img.shields.io/github/actions/workflow/status/robbrad/UKBinCollectionData/codeql-analysis.yml?style=for-the-badge&label=CodeQL+Analysis)](https://github.com/robbrad/UKBinCollectionData/actions/workflows/codeql-analysis.yml) [![Publish Release](https://img.shields.io/github/actions/workflow/status/robbrad/UKBinCollectionData/release.yml?style=for-the-badge&label=Publish+Release)](https://github.com/robbrad/UKBinCollectionData/actions/workflows/release.yml) [![Test Report Deployment](https://img.shields.io/github/actions/workflow/status/robbrad/UKBinCollectionData/pages%2Fpages-build-deployment?style=for-the-badge&label=Test+Report+Deployment)](https://github.com/robbrad/UKBinCollectionData/actions/workflows/pages/pages-build-deployment) # UK Bin Collection Data (UKBCD) This project aims to provide a neat and standard way of providing bin collection data in JSON format from UK councils that have no API to do so. Why would you want to do this? You might want to use this in Home Automation—for example, say you had an LED bar that lit up on the day of bin collection to the colour of the bin you want to take out; then this repo provides the data for that. 
**PLEASE respect a council's infrastructure / usage policy and only collect data for your own personal use at a frequency suited to your collection schedule.** Most scripts make use of [Beautiful Soup 4](https://pypi.org/project/beautifulsoup4/) to scrape data, although others use different approaches, such as emulating web browser behaviour, or reading data from CSV files. [![](https://img.shields.io/badge/-41BDF5?style=for-the-badge&logo=homeassistant&logoColor=white&label=HomeAssistant+Thread)](https://community.home-assistant.io/t/bin-waste-collection/55451) [![](https://img.shields.io/badge/Request%20a%20council-gray?style=for-the-badge&logo=github&logoColor=white)](https://github.com/robbrad/UKBinCollectionData/issues/new/choose) --- ## Requesting your council > :warning: Please check that a request for your council has not already been made. You can do this by searching on the [Issues](https://github.com/robbrad/UKBinCollectionData/issues) page. If an issue already exists, please comment on that issue to express your interest. Please do not open a new issue, as it will be closed as a duplicate. If an issue does not already exist, please fill in a new [Council Request](https://github.com/robbrad/UKBinCollectionData/issues/new/choose) form, providing as much information as possible, including: - Name of the council. - URL to bin collections. - An example postcode and/or [UPRN](https://uprn.uk/) (whichever is relevant). - Any further information. Please be aware that this project is run by volunteer contributors and completion depends on numerous factors - even with a request, we cannot guarantee if/when your council is added to this integration. --- ## Home Assistant Usage ### Install with HACS (recommended) #### Automated [![hacs_badge](https://img.shields.io/badge/HACS-Default-41BDF5.svg?style=for-the-badge)](https://github.com/hacs/integration) This integration can be installed directly via HACS.
To install: * [Add the repository](https://my.home-assistant.io/redirect/hacs_repository/?owner=robbrad&repository=UKBinCollectionData&category=integration) to your HACS installation * Click `Download` For details on how to set up the custom component integration, see the [documentation](https://github.com/robbrad/UKBinCollectionData/tree/master/custom_components/uk_bin_collection). #### Manual 1. Ensure you have [HACS](https://hacs.xyz/) installed 1. In the Home Assistant UI go to `HACS` > `Integrations` > `⋮` > `Custom repositories`. 1. Enter `https://github.com/robbrad/UKBinCollectionData` in the `Repository` field. 1. Select `Integration` as the category then click `ADD`. 1. Click `+ Add Integration` and search for and select `UK Bin Collection Data` then click `Download`. 1. Restart your Home Assistant. 1. In the Home Assistant UI go to `Settings` > `Devices & Services`, click `+ Add Integration` and search for `UK Bin Collection Data`. 1. If you see a "URL of the remote Selenium web driver to use" field when setting up your council, you'll need to provide the URL to a web driver you've set up separately, such as [standalone-chrome](https://hub.docker.com/r/selenium/standalone-chrome). ### Install manually 1. Open the folder for your Home Assistant configuration (where you find `configuration.yaml`). 1. If you do not have a `custom_components` folder there, you need to create it. 1. [Download](https://github.com/robbrad/UKBinCollectionData/archive/refs/heads/master.zip) this repository, then copy the folder `custom_components/uk_bin_collection` into the `custom_components` folder you found/created in the previous step. 1. Restart your Home Assistant. 1. In the Home Assistant UI go to `Settings` > `Devices & Services`, click `+ Add Integration` and search for `UK Bin Collection Data`.
### Overriding the Bin Icon and Bin Colour We realise it is difficult to set a colour from the council's text for the bin type. To keep the integration generic, we don't capture colour from a council (not all councils supply this as a field); we capture only the bin type and next collection date. When you configure the component on the first screen, you can set a JSON string to map the bin type to the colour and icon. Here is an example to set the colour and icon for the type `Empty Standard General Waste`. This type is the type returned from the council for the bin. You can do this for multiple bins. If you miss this on the first setup, you can reconfigure it. ```json { "Empty Standard General Waste": { "icon": "mdi:trash-can", "color": "blue" } } ``` --- ## Standalone Usage ```commandline PS G:\Projects\Python\UKBinCollectionData\uk_bin_collection\collect_data.py usage: collect_data.py [-h] [-p POSTCODE] [-n NUMBER] [-u UPRN] module URL positional arguments: module Name of council module to use (required) URL URL to parse (required) options: -h, --help show this help message (optional) -p POSTCODE, --postcode POSTCODE Postcode to parse - should include (optional) a space and be wrapped in double quotes -n NUMBER, --number NUMBER House number to parse (optional) -u UPRN, --uprn UPRN UPRN to parse (optional) ``` ### Quickstart The basic command to execute a script is: ```commandline python collect_data.py <council_name> <collection_url> ``` where ```council_name``` is the name of the council's .py script (without the .py) and ```collection_url``` is the URL to scrape. The help documentation refers to these as "module" and "URL", respectively. Supported council scripts can be found in the `uk_bin_collection/uk_bin_collection/councils` folder. Some scripts require additional parameters, for example, when a UPRN is not passed in a URL, or when the script is not scraping a web page. For instance, the Leeds City Council script needs two additional parameters—a postcode and a house number.
This is done like so: ```commandline python collect_data.py LeedsCityCouncil https://www.leeds.gov.uk/residents/bins-and-recycling/check-your-bin-day -p "LS1 2JG" -n 41 ``` - A **postcode** can be passed with `-p "postcode"` or `--postcode "postcode"`. The postcode must always include a space in the middle and be wrapped in double quotes (due to how command line arguments are handled). - A **house number** can be passed with `-n number` or `--number number`. - A **UPRN reference** can be passed with `-u uprn` or `--uprn uprn`. To check the parameters needed for your council's script, please check the [project wiki](https://github.com/robbrad/UKBinCollectionData/wiki) for more information. ### Project dependencies Some scripts rely on external packages to function. A list of required packages for both development and execution can be found in the project's [PROJECT_TOML](https://github.com/robbrad/UKBinCollectionData/blob/feature/%2353_integration_tests/pyproject.toml). Installation can be done via `poetry install` from within the root of the repo. --- ## UPRN Finder Some councils make use of the UPRN (Unique Property Reference Number) to identify your property. You can find yours [here](https://www.findmyaddress.co.uk/search) or [here](https://uprn.uk/). --- ## Selenium Some councils need Selenium to run the scrape on behalf of Home Assistant. The easiest way to do this is to run Selenium in a Docker container. However you do this, the Home Assistant server must be able to reach the Selenium server. ### Instructions for Windows, Linux, and Mac #### Step 1: Install Docker ##### Windows 1. **Download Docker Desktop for Windows:** * Go to the Docker website: Docker Desktop for Windows. * Download and install Docker Desktop. 2. **Run Docker Desktop:** * After installation, run Docker Desktop. * Follow the on-screen instructions to complete the setup. * Ensure Docker is running by checking the Docker icon in the system tray. ##### Linux 1.
**Install Docker:** * Open a terminal and run the following commands: ```bash sudo apt-get update sudo apt-get install \ apt-transport-https \ ca-certificates \ curl \ gnupg \ lsb-release curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /usr/share/keyrings/docker-archive-keyring.gpg echo \ "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu \ $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null sudo apt-get update sudo apt-get install docker-ce docker-ce-cli containerd.io ``` 2. **Start Docker:** * Run the following command to start Docker: ```bash sudo systemctl start docker ``` 3. **Enable Docker to start on boot:** ```bash sudo systemctl enable docker ``` ##### Mac 1. **Download Docker Desktop for Mac:** * Go to the Docker website: Docker Desktop for Mac. * Download and install Docker Desktop. 2. **Run Docker Desktop:** * After installation, run Docker Desktop. * Follow the on-screen instructions to complete the setup. * Ensure Docker is running by checking the Docker icon in the menu bar. #### Step 2: Pull and Run Selenium Standalone Chrome Docker Image 1. **Open a terminal or command prompt:** 2. **Pull the Selenium Standalone Chrome image:** ```bash docker pull selenium/standalone-chrome ``` 3. **Run the Selenium Standalone Chrome container:** ```bash docker run -d -p 4444:4444 --name selenium-chrome selenium/standalone-chrome ``` #### Step 3: Test the Selenium Server 1. **Navigate to the Selenium server URL in your web browser:** * Open a web browser and go to `http://localhost:4444`. * You should see the Selenium Grid console. #### Step 4: Supply the Selenium Server URL to UKBinCollectionData 1. **Find the `UKBinCollectionData` project:** * Go to the GitHub repository: [UKBinCollectionData](https://github.com/robbrad/UKBinCollectionData). 2.
**Supply the Selenium Server URL:** * Typically, the URL will be `http://localhost:4444/wd/hub`. * You might need to update a configuration file or environment variable in the project to use this URL. Check the project's documentation for specific instructions. ### Summary of Commands **Windows/Linux/Mac:** ```bash docker pull selenium/standalone-chrome docker run -d -p 4444:4444 --name selenium-chrome selenium/standalone-chrome ``` **Selenium Server URL:** * `http://localhost:4444/wd/hub` --- ### Instructions for Home Assistant OS If you're running Home Assistant Supervised, it's possible to host the Selenium instance on the same system. This guide is based on a Raspberry Pi 4. Instructions for other systems may vary. #### Prerequisites 1. Install **Portainer** from Alex Belgium's add-on repository: [alexbelgium/hassio-addons](https://github.com/alexbelgium/hassio-addons) --- #### Step 1: Pull and Run Docker Image Since the Raspberry Pi 4 uses an ARM64-based architecture, use the `seleniarm/standalone-chromium:latest` Docker image. 1. Open **Portainer** and navigate to the **Images** tab. 2. In the **Image** text box, enter: ``` seleniarm/standalone-chromium:latest ``` 3. Click **Pull the image**. 4. Once the image is pulled, navigate to the **Containers** tab and click **Add container**. 5. Configure the container: - **Name:** Give it a clear and descriptive name (e.g., `selenium-chromium`). - **Image:** Enter: ``` seleniarm/standalone-chromium ``` Make sure to uncheck **Always pull the image**. - **Network ports configuration:** - Click **Map additional port**. - Set both the **Host** and **Container** ports to `4444`. 6. Click **Deploy the container**. --- #### Step 2: Configure UKBinCollectionData Integration 1. **Add the integration** in Home Assistant. 2. On the second stage of the integration setup wizard: - Ensure that `http://localhost:4444` shows as accessible. - If not, verify that the Selenium container is running in Portainer. 3. 
Enter the required information for the integration. 4. In the **Remote Selenium Server** text box, enter: ``` http://<IP_ADDRESS>:4444 ``` Replace `<IP_ADDRESS>` with the IP address of your Home Assistant system. --- ## Reports All integration test results are in [CodeCov](https://app.codecov.io/gh/robbrad/UKBinCollectionData/) ### Nightly Full Integration Test Reports: - [Nightly Council Test](https://app.codecov.io/gh/robbrad/UKBinCollectionData/tests/master) 🗺️ View Test Coverage Map (in VS Code) --------------------------------------- You can generate integration test results and view the interactive UK council coverage map with traffic-light-style statuses for each council. ### 🧪 Step 1: Run Integration Tests Run: `make integration-tests` This runs the full BDD test suite and outputs a `junit.xml` report to: `build/test/integration-test-results/junit.xml` ### 📊 Step 2: Generate Map Test Results JSON Convert the JUnit XML output to a flat test result JSON: `make generate-test-map-test-results` This creates: `build/integration-test-results/test_results.json` This file is used by the map to colour each council: * ✅ Green: Test passed * 🟠 Amber: Test failed * ❌ Red: Not integrated ### 🗺️ Step 3: Open the Map Open the map viewer in VS Code: 1. Right-click the `map.html` file in VS Code and choose **Show Preview** 2. The map will open in your browser, showing real-time integration coverage and test results. ![Test Results Map](test_results_map.png) --- ## ICS Calendar Generation You can convert bin collection data to an ICS calendar file that can be imported into calendar applications like Google Calendar, Apple Calendar, Microsoft Outlook, etc.
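To illustrate the first step of this conversion, here is a minimal standard-library sketch that groups same-day collections together, as `bin_to_ics.py` does before emitting calendar events. The sample bin types are made up; the field names and the `DD/MM/YYYY` date format match what the script expects:

```python
import datetime
import json
from collections import defaultdict

# Example payload in the shape produced by collect_data.py (illustrative values)
raw = '''{"bins": [
  {"type": "Recycling", "collectionDate": "04/07/2025"},
  {"type": "General Waste", "collectionDate": "04/07/2025"},
  {"type": "Garden Waste", "collectionDate": "11/07/2025"}
]}'''

data = json.loads(raw)

# Group bin types by their collection date (dates are DD/MM/YYYY)
by_date = defaultdict(list)
for entry in data["bins"]:
    date = datetime.datetime.strptime(entry["collectionDate"], "%d/%m/%Y").date()
    by_date[date].append(entry["type"])

# One calendar event would be created per date, combining the bin types
for date, types in sorted(by_date.items()):
    print(date.isoformat(), "->", ", ".join(types))  # e.g. 2025-07-04 -> Recycling, General Waste
```

`bin_to_ics.py` then turns each grouped date into a single ICS event, so two bins collected on the same day produce one calendar entry rather than two.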
### Overview The `bin_to_ics.py` script allows you to: - Convert JSON output from bin collection data into ICS calendar events - Group multiple bin collections on the same day into a single event - Create all-day events (default) or timed events - Add optional reminders/alarms to events - Customize the calendar name ### Requirements - Python 3.6 or higher - The `icalendar` package, which can be installed with: ```bash pip install icalendar ``` ### Basic Usage ```bash # Basic usage with stdin input and default output file (bin.ics) python bin_to_ics.py < bin_data.json # Specify input and output files python bin_to_ics.py -i bin_data.json -o my_calendar.ics # Custom calendar name python bin_to_ics.py -i bin_data.json -o my_calendar.ics -n "My Bin Collections" ``` ### Options ``` --input, -i Input JSON file (if not provided, read from stdin) --output, -o Output ICS file (default: bin.ics) --name, -n Calendar name (default: Bin Collections) --alarms, -a Comma-separated list of alarm times before event (e.g., "1d,2h,30m") --no-all-day Create timed events instead of all-day events ``` ### Examples #### Adding Reminders (Alarms) Add reminders 1 day and 2 hours before each collection: ```bash python bin_to_ics.py -i bin_data.json -a "1d,2h" ``` The time format supports: - Days: `1d`, `2day`, `3days` - Hours: `1h`, `2hour`, `3hours` - Minutes: `30m`, `45min`, `60mins`, `90minutes` #### Creating Timed Events By default, events are created as all-day events. To create timed events instead (default time: 7:00 AM): ```bash python bin_to_ics.py -i bin_data.json --no-all-day ``` ### Integration with Bin Collection Data Retriever You can pipe the output from the bin collection data retriever directly to the ICS generator. The required parameters (postcode, house number, UPRN, etc.) 
depend on the specific council implementation - refer to the [Quickstart](#quickstart) section above or check the [project wiki](https://github.com/robbrad/UKBinCollectionData/wiki) for details about your council. ```bash python uk_bin_collection/uk_bin_collection/collect_data.py CouncilName "URL" [OPTIONS] | python bin_to_ics.py [OPTIONS] ``` #### Complete Example for a Council ```bash python uk_bin_collection/uk_bin_collection/collect_data.py CouncilName \ "council_url" \ -p "YOUR_POSTCODE" \ -n "YOUR_HOUSE_NUMBER" \ -w "http://localhost:4444/wd/hub" | python bin_to_ics.py \ --name "My Bin Collections" \ --output my_bins.ics \ --alarms "1d,12h" ``` This will: 1. Fetch bin collection data for your address from your council's website 2. Convert it to an ICS file named "my_bins.ics" 3. Set the calendar name to "My Bin Collections" 4. Add reminders 1 day and 12 hours before each collection For postcode lookup and UPRN information, please check the [UPRN Finder](#uprn-finder) section above. ### Using the Calendar You have two options for using the generated ICS file: #### 1. Importing the Calendar You can directly import the ICS file into your calendar application: - **Google Calendar**: Go to Settings > Import & export > Import - **Apple Calendar**: File > Import - **Microsoft Outlook**: File > Open & Export > Import/Export > Import an iCalendar (.ics) Note: Importing creates a static copy of the calendar events. If bin collection dates change, you'll need to re-import the calendar. #### 2. 
Subscribing to the Calendar If you host the ICS file on a publicly accessible web server, you can subscribe to it as an internet calendar: - **Google Calendar**: Go to "Other calendars" > "+" > "From URL" > Enter the URL of your hosted ICS file - **Apple Calendar**: File > New Calendar Subscription > Enter the URL - **Microsoft Outlook**: File > Account Settings > Internet Calendars > New > Enter the URL Benefits of subscribing: - Calendar automatically updates when the source file changes - No need to manually re-import when bin collection dates change - Easily share the calendar with household members You can set up a cron job or scheduled task to regularly: 1. Retrieve the latest bin collection data 2. Generate a fresh ICS file 3. Publish it to a web-accessible location ### Additional Examples and Use Cases #### Automation with Cron Jobs Create a weekly update script on a Linux/Mac system: ```bash #!/bin/bash # File: update_bin_calendar.sh # Set variables COUNCIL="YourCouncilName" COUNCIL_URL="https://your-council-website.gov.uk/bins" POSTCODE="YOUR_POSTCODE" HOUSE_NUMBER="YOUR_HOUSE_NUMBER" OUTPUT_DIR="/var/www/html/calendars" # Web-accessible directory CALENDAR_NAME="Household Bins" # Ensure output directory exists mkdir -p $OUTPUT_DIR # Run the collector and generate the calendar cd /path/to/UKBinCollectionData && \ python uk_bin_collection/uk_bin_collection/collect_data.py $COUNCIL "$COUNCIL_URL" \ -p "$POSTCODE" -n "$HOUSE_NUMBER" | \ python bin_to_ics.py --name "$CALENDAR_NAME" --output "$OUTPUT_DIR/bins.ics" --alarms "1d,6h" # Add timestamp to show last update time echo "Calendar last updated: $(date)" > "$OUTPUT_DIR/last_update.txt" ``` Make the script executable: ```bash chmod +x update_bin_calendar.sh ``` Add to crontab to run weekly (every Monday at 2 AM): ```bash 0 2 * * 1 /path/to/update_bin_calendar.sh ``` **Google Assistant/Alexa Integration** If you have your calendar connected to Google Calendar or Outlook, you can ask your smart assistant about 
upcoming bin collections: - "Hey Google, when is my next bin collection?" - "Alexa, what's on my calendar tomorrow?" (will include bin collections) ## Docker API Server We have created an API for this located under [uk_bin_collection_api_server](https://github.com/robbrad/UKBinCollectionData/uk_bin_collection_api_server) ### Prerequisites - Docker installed on your machine. - Python (if you plan to run the API locally without Docker). ### Running the API with Docker 1. Clone this repository. 2. Navigate to the `uk_bin_collection_api_server` directory of the project. #### Build the Docker Container ```bash docker build -t ukbc_api_server . ``` #### Run the Docker Container ```bash docker run -p 8080:8080 ukbc_api_server ``` #### Accessing the API Once the Docker container is running, you can access the API endpoints: API Base URL: http://localhost:8080/api Swagger UI: http://localhost:8080/api/ui/ #### API Documentation The API documentation can be accessed via the Swagger UI. Use the Swagger UI to explore available endpoints, test different requests, and understand the API functionalities. ![Swagger UI](SwaggerUI.png) #### API Endpoints `GET /bin_collection/{council}` Description: Retrieves information about bin collections for the specified council. Parameters: council (required): Name of the council. Other optional parameters: [Specify optional parameters if any] Example Request: ```bash curl -X GET "http://localhost:8080/api/bin_collection/{council}" -H "accept: application/json" ``` ## Docker Compose This includes the Selenium standalone-chrome for Selenium-based councils. ```yaml version: '3' services: ukbc_api_server: build: context: . dockerfile: Dockerfile ports: - "8080:8080" # Adjust the ports as needed depends_on: - selenium selenium: image: selenium/standalone-chrome:latest ports: - "4444:4444" ``` ### Run with ```bash sudo apt-get update sudo apt-get install docker-compose docker-compose up ``` --- ## FAQ #### I've got an issue/support question—what do I do?
Please post in the [HomeAssistant thread](https://community.home-assistant.io/t/bin-waste-collection/55451) or raise a new (non-council request) [issue](https://github.com/robbrad/UKBinCollectionData/issues/new). #### I'd like to contribute, where do I start? Contributions are always welcome! See ```CONTRIBUTING.md``` to get started. Please adhere to the project's [code of conduct](https://github.com/robbrad/UKBinCollectionData/blob/master/CODE_OF_CONDUCT.md). - If you're new to coding/Python/BeautifulSoup, feel free to check [here](https://github.com/robbrad/UKBinCollectionData/issues?q=is%3Aissue+is%3Aopen+label%3A%22good+first+issue%22) for issues that are good for newcomers! - If you would like to try writing your own scraper, feel free to fork this project and use existing scrapers as a base for your approach (or `councilclasstemplate.py`). ## Contributors Image of contributors ================================================ FILE: behave.ini ================================================ [behave] #default_tags = not (@xfail or @not_implemented) show_skipped = true format = rerun progress3 outfiles = rerun.txt build/behave.reports/report_progress3.txt junit = true junit_directory = build/behave.reports logging_level = INFO # -- HTML-FORMATTER REQUIRES: pip install behave-html-formatter # SEE ALSO: https://github.com/behave-contrib/behave-html-formatter [behave.formatters] html = behave_html_formatter:HTMLFormatter ================================================ FILE: bin_to_ics.py ================================================ #!/usr/bin/env python3 """ Script to convert UK Bin Collection Data to ICS calendar file. Takes JSON output from the bin collection data retriever and creates calendar events for each collection date. The events are saved to an ICS file that can be imported into calendar applications. 
Features: - Creates all-day events for bin collections by default - Optional alarms/reminders before collection days - Groups multiple bin collections on the same day into one event """ import argparse import datetime import json import os import sys from typing import Dict, List, Optional, Union try: from icalendar import Calendar, Event, Alarm except ImportError: print("Error: Required package 'icalendar' not found.") print("Please install it with: pip install icalendar") sys.exit(1) def parse_time_delta(time_str: str) -> datetime.timedelta: """ Parse a time string into a timedelta object. Formats supported: - "1d" or "1day" or "1days" for days - "2h" or "2hour" or "2hours" for hours - "30m" or "30min" or "30mins" or "30minutes" for minutes Args: time_str: String representing a time duration Returns: timedelta object representing the duration """ time_str = time_str.lower().strip() # Handle days if time_str.endswith('d') or time_str.endswith('day') or time_str.endswith('days'): if time_str.endswith('days'): value = int(time_str[:-4]) elif time_str.endswith('day'): value = int(time_str[:-3]) else: value = int(time_str[:-1]) return datetime.timedelta(days=value) # Handle hours elif time_str.endswith('h') or time_str.endswith('hour') or time_str.endswith('hours'): if time_str.endswith('hours'): value = int(time_str[:-5]) elif time_str.endswith('hour'): value = int(time_str[:-4]) else: value = int(time_str[:-1]) return datetime.timedelta(hours=value) # Handle minutes elif time_str.endswith('m') or time_str.endswith('min') or time_str.endswith('mins') or time_str.endswith('minutes'): if time_str.endswith('minutes'): value = int(time_str[:-7]) elif time_str.endswith('mins'): value = int(time_str[:-4]) elif time_str.endswith('min'): value = int(time_str[:-3]) else: value = int(time_str[:-1]) return datetime.timedelta(minutes=value) # Default to hours if no unit specified else: try: value = int(time_str) return datetime.timedelta(hours=value) except ValueError: raise 
ValueError(f"Invalid time format: {time_str}. Use format like '1d', '2h', or '30m'.") def create_bin_calendar( bin_data: Dict, calendar_name: str = "Bin Collections", alarm_times: Optional[List[datetime.timedelta]] = None, all_day: bool = True ) -> Calendar: """ Create a calendar from bin collection data. Args: bin_data: Dictionary containing bin collection data calendar_name: Name of the calendar alarm_times: List of timedeltas for when reminders should trigger before the event all_day: Whether the events should be all-day events Returns: Calendar object with events for each bin collection """ cal = Calendar() cal.add('prodid', '-//UK Bin Collection Data//bin_to_ics.py//EN') cal.add('version', '2.0') cal.add('name', calendar_name) cal.add('x-wr-calname', calendar_name) # Process bin collection data if 'bins' not in bin_data: print("Error: Invalid bin data format. 'bins' key not found.") sys.exit(1) # Group collections by date to combine bins collected on the same day collections_by_date = {} for bin_info in bin_data['bins']: if 'type' not in bin_info or 'collectionDate' not in bin_info: continue bin_type = bin_info['type'] collection_date_str = bin_info['collectionDate'] # Convert date string to datetime object try: # Expecting format DD/MM/YYYY collection_date = datetime.datetime.strptime(collection_date_str, "%d/%m/%Y").date() except ValueError: print(f"Warning: Unable to parse date '{collection_date_str}'. 
Skipping.") continue # Add to collections by date if collection_date not in collections_by_date: collections_by_date[collection_date] = [] collections_by_date[collection_date].append(bin_type) # Create events for each collection date for collection_date, bin_types in collections_by_date.items(): event = Event() # Join multiple bin types into one summary if needed bin_types_str = ", ".join(bin_types) # Create event summary and description summary = f"Bin Collection: {bin_types_str}" description = f"Collection for: {bin_types_str}" # Add event details event.add('summary', summary) event.add('description', description) # Set the event as all-day if requested if all_day: event.add('dtstart', collection_date) event.add('dtend', collection_date + datetime.timedelta(days=1)) else: # Default to 7am for non-all-day events collection_datetime = datetime.datetime.combine( collection_date, datetime.time(7, 0, 0) ) event.add('dtstart', collection_datetime) event.add('dtend', collection_datetime + datetime.timedelta(hours=1)) # Add alarms if specified if alarm_times: for alarm_time in alarm_times: alarm = create_alarm(trigger_before=alarm_time) event.add_component(alarm) # Generate a unique ID for the event event_id = f"bin-collection-{collection_date.isoformat()}-{hash(bin_types_str) % 10000:04d}@ukbincollection" event.add('uid', event_id) # Add the event to the calendar cal.add_component(event) return cal def create_alarm(trigger_before: datetime.timedelta) -> Alarm: """ Create an alarm component for calendar events. Args: trigger_before: How long before the event to trigger the alarm Returns: Alarm component """ alarm = Alarm() alarm.add('action', 'DISPLAY') alarm.add('description', 'Bin collection reminder') alarm.add('trigger', -trigger_before) return alarm def save_calendar(calendar: Calendar, output_file: str) -> None: """ Save a calendar to an ICS file. 
Args: calendar: Calendar object to save output_file: Path to save the calendar file """ with open(output_file, 'wb') as f: f.write(calendar.to_ical()) print(f"Calendar saved to {output_file}") def load_json_data(input_file: Optional[str] = None) -> Dict: """ Load bin collection data from JSON file or stdin. Args: input_file: Path to JSON file (if None, read from stdin) Returns: Dictionary containing bin collection data """ if input_file: try: with open(input_file, 'r') as f: return json.load(f) except (json.JSONDecodeError, FileNotFoundError) as e: print(f"Error reading input file: {e}") sys.exit(1) else: try: return json.load(sys.stdin) except json.JSONDecodeError as e: print(f"Error parsing JSON from stdin: {e}") sys.exit(1) def main(): parser = argparse.ArgumentParser(description='Convert UK Bin Collection Data to ICS calendar file.') parser.add_argument('--input', '-i', help='Input JSON file (if not provided, read from stdin)') parser.add_argument('--output', '-o', help='Output ICS file (default: bin.ics)', default='bin.ics') parser.add_argument('--name', '-n', help='Calendar name (default: Bin Collections)', default='Bin Collections') parser.add_argument('--alarms', '-a', help='Comma-separated list of alarm times before event (e.g., "1d,2h,30m")') parser.add_argument('--no-all-day', action='store_true', help='Create timed events instead of all-day events') args = parser.parse_args() # Parse alarm times alarm_times = None if args.alarms: alarm_times = [] for alarm_str in args.alarms.split(','): try: alarm_times.append(parse_time_delta(alarm_str.strip())) except ValueError as e: print(f"Warning: {e}") # Load bin collection data bin_data = load_json_data(args.input) # Create calendar calendar = create_bin_calendar( bin_data, args.name, alarm_times=alarm_times, all_day=not args.no_all_day ) # Save calendar to file save_calendar(calendar, args.output) if __name__ == '__main__': main() ================================================ FILE: conftest.py 
================================================ # conftest.py import pytest from _pytest.config.argparsing import Parser from _pytest.fixtures import FixtureRequest from homeassistant.core import HomeAssistant from unittest.mock import AsyncMock, MagicMock, patch def pytest_addoption(parser: Parser) -> None: parser.addoption("--headless", action="store", default="True", type=str) parser.addoption("--local_browser", action="store", default="False", type=str) parser.addoption("--selenium_url", action="store", default="http://localhost:4444", type=str) @pytest.fixture(scope='session') def headless_mode(request: FixtureRequest) -> str: return request.config.getoption("--headless") @pytest.fixture(scope='session') def local_browser(request: FixtureRequest) -> str: return request.config.getoption("--local_browser") @pytest.fixture(scope='session') def selenium_url(request: FixtureRequest) -> str: return request.config.getoption("--selenium_url") @pytest.fixture def hass(): """Mock HomeAssistant instance.""" hass = MagicMock(spec=HomeAssistant) # Mock the event loop with create_task as AsyncMock hass.loop = MagicMock() hass.loop.create_task = AsyncMock() # Mock config_entries and its flow hass.config_entries = MagicMock() hass.config_entries.flow = MagicMock() # Mock asynchronous methods with AsyncMock hass.config_entries.flow.async_init = AsyncMock() hass.config_entries.flow.async_configure = AsyncMock() # Mock async_get_entry to return a MockConfigEntry when called hass.config_entries.async_get_entry = AsyncMock() # Mock async_unload as an AsyncMock hass.config_entries.async_unload = AsyncMock(return_value=True) # Mock async_block_till_done as an AsyncMock hass.async_block_till_done = AsyncMock() hass.async_add_executor_job = AsyncMock() # Ensure compatibility with async calls return hass @pytest.fixture def enable_custom_integrations(): """Fixture to enable custom integrations.""" with patch("homeassistant.helpers.discovery.load_platform") as mock_load: yield 
mock_load ================================================ FILE: custom_components/__init__.py ================================================ ================================================ FILE: custom_components/uk_bin_collection/README.md ================================================ # UK Bin Collection Integration Configuration This integration allows you to configure the collection details for your local UK council. The configuration flow is divided into several steps, and some fields are dynamically shown based on your selected council’s requirements. --- ## Table of Contents - [UK Bin Collection Integration Configuration](#uk-bin-collection-integration-configuration) - [Table of Contents](#table-of-contents) - [Step 1: Basic Setup](#step-1-basic-setup) - [Step 2: Council-Specific Details](#step-2-council-specific-details) - [Selenium \& Chromium Checks](#selenium--chromium-checks) - [Reconfiguration / Options Flow](#reconfiguration--options-flow) - [Validation Requirements](#validation-requirements) - [Icon Color Mapping JSON Example](#icon-color-mapping-json-example) - [Service: `uk_bin_collection.manual_refresh`](#service-uk_bin_collectionmanual_refresh) - [Service Data](#service-data) - [How the Service Works](#how-the-service-works) - [Example Automation to Refresh Bin Data (Manual Refresh Mode)](#example-automation-to-refresh-bin-data-manual-refresh-mode) --- ## Step 1: Basic Setup In the initial configuration step you must provide the following: | Field Name | Requirement | Type | Description | |-------------------------|-------------|---------|-------------| | **name** | Required | String | A unique identifier for your configuration entry. This is used to distinguish different configurations. | | **council** | Required | Select | A drop-down selection that displays available councils by their *wiki name*. Your selection will later be mapped to the corresponding council key. 
| **manual_refresh_only** | Optional | Boolean | If checked, only manual refreshes will be performed. Defaults to `True`. |
| **icon_color_mapping** | Optional | String | A text field for entering a JSON-formatted mapping for icon colors. If provided, the JSON must be valid. |

> **Note:** The list of available councils is dynamically loaded from an external data source.

---

## Step 2: Council-Specific Details

After you provide the basic details, the next step requests council-specific information. The fields displayed depend on the selected council’s requirements. Below is a summary of possible fields:

| Field Name | Requirement | Type | Description |
|--------------------|------------------------------|---------|-------------|
| **url** | Required (if applicable) | String | The URL to access the bin collection data. Some councils require this field; however, if the council’s configuration has `skip_get_url` enabled, this field may be pre-filled or skipped. |
| **uprn** | Required (if applicable) | String | The Unique Property Reference Number, if the council supports it. |
| **postcode** | Required (if applicable) | String | The postcode for the address in question. |
| **number** | Required (if applicable) | String | The house number. (This corresponds to the `"house_number"` key in the council configuration.) |
| **usrn** | Required (if applicable) | String | The Unique Street Reference Number, if required by the council. |
| **web_driver** | Optional (if applicable) | String | If the council requires Selenium for data fetching, you may provide the web driver command. |
| **headless** | Optional (if applicable) | Boolean | Indicates whether to run the browser in headless mode (default is `True`). Only shown if `web_driver` is applicable. |
| **local_browser** | Optional (if applicable) | Boolean | Choose whether to use a local browser instance (default is `False`). Only shown if `web_driver` is applicable.
| **timeout** | Optional | Integer | Sets the request timeout in seconds. Defaults to `60` seconds and must be at least `10`. |
| **update_interval** | Optional | Integer | The refresh frequency in hours. Defaults to `12` hours and must be at least `1`. |

### Selenium & Chromium Checks

For councils that require Selenium (i.e. if the council configuration contains a `"web_driver"` key):

- **Selenium Server Check:** The integration checks several remote Selenium server URLs (and an optional custom URL, if provided) to determine if they are accessible. The results are displayed as part of the informational message.
- **Chromium Installation Check:** A check is performed to ensure that a local Chromium browser is installed. The result is shown to help troubleshoot if Selenium is required.

The combined status of these checks is presented as an HTML-formatted message in the council-specific form.

---

## Reconfiguration / Options Flow

If you need to update your configuration later, you can do so via the options (or reconfiguration) flow. The following fields are available for editing:

| Field Name | Requirement | Type | Description |
|-------------------------|-------------|---------|-------------|
| **name** | Required | String | The identifier for the configuration entry. |
| **council** | Required | Select | A drop-down list to select your council (displayed by its *wiki name*). |
| **manual_refresh_only** | Optional | Boolean | If enabled, the system will perform only manual refreshes. |
| **update_interval** | Required | Integer | The refresh frequency in hours (must be at least 1). If manual refresh is enabled, this will be set to `None`. |
| **icon_color_mapping** | Optional | String | A JSON-formatted string for mapping icon colors. Must be valid JSON if provided.
> **Additional Fields:** Depending on your initial configuration and the council selected, you may also be able to update fields such as **url**, **uprn**, **postcode**, **number**, **web_driver**, **headless**, **local_browser**, and **timeout**.

Once you submit the updated options, the integration will reload the configuration with the new settings.

---

## Validation Requirements

- **Unique Name & Duplicate Check:** The system checks to ensure that the provided `name` or combination of `council` and `url` is unique. If a duplicate entry exists, an error is shown.
- **JSON Format:** Any input provided in the **icon_color_mapping** field must be valid JSON. If the JSON is invalid, you will be prompted to correct the input.
- **Numeric Ranges:**
  - **Timeout:** Must be an integer and at least `10` seconds.
  - **Update Interval:** Must be an integer and at least `1` hour.
- **Council-Specific Fields:** The required fields in the council-specific step (such as `url`, `uprn`, `postcode`, etc.) depend on the selected council's configuration. Only the fields relevant to the chosen council will be presented.

---

## Icon Color Mapping JSON Example

Below is an example of a valid JSON configuration for the **icon_color_mapping** field. This mapping allows you to customize the icons and colors for different bin types in the sensor platform. The bin name **must match** the name of the bin returned from the council.

```json
{
  "general": {
    "icon": "mdi:trash-can",
    "color": "green"
  },
  "recycling": {
    "icon": "mdi:recycle",
    "color": "blue"
  },
  "food": {
    "icon": "mdi:food",
    "color": "red"
  },
  "garden": {
    "icon": "mdi:leaf",
    "color": "brown"
  }
}
```

## Service: `uk_bin_collection.manual_refresh`

This service triggers a manual refresh of the bin collection data for a specific configuration entry. It is particularly useful when your integration is set to **manual refresh only** (i.e., when the `manual_refresh_only` option is enabled in your configuration).
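Before pasting a mapping into the **icon_color_mapping** field, you can sanity-check it against the JSON-validity rule described above. The sketch below mirrors the kind of `json.loads` check the config flow performs; the helper name `is_valid_icon_color_mapping` and the `dict` shape check are illustrative, not part of the integration's API:

```python
import json


def is_valid_icon_color_mapping(raw: str) -> bool:
    """Return True if the given icon_color_mapping text would pass validation.

    Illustrative helper: the config flow performs an equivalent json.loads
    check; the dict-shape assertion here is an assumption based on the
    example mapping above (an object keyed by bin name).
    """
    if not raw:
        return True  # the field is optional, so an empty value is fine
    try:
        parsed = json.loads(raw)
    except json.JSONDecodeError:
        return False
    return isinstance(parsed, dict)


valid = '{"general": {"icon": "mdi:trash-can", "color": "green"}}'
invalid = "{'general': 'not valid json'}"  # single quotes are not valid JSON
print(is_valid_icon_color_mapping(valid))    # True
print(is_valid_icon_color_mapping(invalid))  # False
```

A common mistake is using single-quoted keys (Python-style dict syntax), which the JSON parser rejects.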
When called, the service will instruct the data coordinator to fetch the latest bin collection data immediately.

### Service Data

| Field | Type | Description |
|-----------|--------|-------------|
| `entry_id` | String | **Required.** The unique identifier of the configuration entry. You can find this value in the integration details or in Home Assistant's configuration registry. |

### How the Service Works

1. **Input Verification:** The service checks whether the `entry_id` is provided in the service call data.
2. **Configuration Entry Lookup:** It verifies that a configuration entry exists for the provided `entry_id` in Home Assistant's data storage.
3. **Coordinator Check:** It ensures that the corresponding data coordinator (which is responsible for fetching bin collection data) is available.
4. **Data Refresh:** The service calls `async_request_refresh()` on the coordinator to fetch the latest data.

If any of these steps fail, error messages will be logged to help diagnose the issue.

---

## Example Automation to Refresh Bin Data (Manual Refresh Mode)

Below is an example automation that triggers a manual refresh of the bin collection data every day at 7:00 AM. This is useful if your integration is configured for manual refresh only. Be sure to replace `"YOUR_CONFIG_ENTRY_ID"` with the actual entry ID of your configuration.

```yaml
automation:
  - alias: "Daily Manual Refresh for UK Bin Collection"
    description: "Triggers a manual refresh of the bin collection data every day at 7 AM for integrations set to manual refresh only."
    trigger:
      - platform: time
        at: "07:00:00"
    action:
      - service: uk_bin_collection.manual_refresh
        data:
          entry_id: "YOUR_CONFIG_ENTRY_ID"
    mode: single
```
================================================
FILE: custom_components/uk_bin_collection/__init__.py
================================================
"""The UK Bin Collection integration."""

import asyncio
import json
import logging
from datetime import datetime, timedelta

from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers import config_validation as cv
from homeassistant.helpers.update_coordinator import DataUpdateCoordinator, UpdateFailed
from homeassistant.util import dt as dt_util

from uk_bin_collection.uk_bin_collection.collect_data import UKBinCollectionApp

from .const import DOMAIN, EXCLUDED_ARG_KEYS, LOG_PREFIX, PLATFORMS

PLATFORM_SCHEMA = cv.platform_only_config_schema

_LOGGER = logging.getLogger(__name__)


async def async_setup(hass: HomeAssistant, config: dict) -> bool:
    """Set up the UK Bin Collection component."""
    _LOGGER.debug(f"{LOG_PREFIX} async_setup called with config: {config}")
    try:
        hass.data.setdefault(DOMAIN, {})
        _LOGGER.debug(
            f"{LOG_PREFIX} hass.data[DOMAIN] initialized: {hass.data[DOMAIN]}"
        )

        async def handle_manual_refresh(call):
            """Refresh all bin sensors for a given config entry."""
            _LOGGER.debug(
                f"{LOG_PREFIX} manual_refresh service called with data: {call.data}"
            )
            entry_id = call.data.get("entry_id")
            if not entry_id:
                _LOGGER.error(
                    "[UKBinCollection] No 'entry_id' was passed to uk_bin_collection.manual_refresh service."
                )
                return

            if entry_id not in hass.data[DOMAIN]:
                _LOGGER.error(
                    "[UKBinCollection] No config entry found for entry_id: %s", entry_id
                )
                return

            coordinator = hass.data[DOMAIN][entry_id].get("coordinator")
            if not coordinator:
                _LOGGER.error(
                    "[UKBinCollection] Coordinator is missing for entry_id: %s",
                    entry_id,
                )
                return

            _LOGGER.debug(
                "[UKBinCollection] About to request a manual refresh via coordinator"
            )
            await coordinator.async_request_refresh()
            _LOGGER.debug("[UKBinCollection] Manual refresh completed")

        # Register a service named `uk_bin_collection.manual_refresh`
        _LOGGER.debug("[UKBinCollection] Registering manual_refresh service")
        hass.services.async_register(
            DOMAIN,
            "manual_refresh",  # The service name
            handle_manual_refresh,
        )
        _LOGGER.debug(
            "[UKBinCollection] manual_refresh service registered successfully"
        )

        _LOGGER.info("[UKBinCollection] async_setup completed without errors.")
        return True
    except Exception as exc:
        _LOGGER.exception("%s Unexpected error in async_setup: %s", LOG_PREFIX, exc)
        return False


async def async_migrate_entry(hass: HomeAssistant, config_entry: ConfigEntry) -> bool:
    """Migrate old config entries to new version."""
    try:
        _LOGGER.debug(
            f"{LOG_PREFIX} async_migrate_entry called for entry_id={config_entry.entry_id}, version={config_entry.version}"
        )
        if config_entry.version == 1:
            _LOGGER.info(
                f"{LOG_PREFIX} Migrating config entry {config_entry.entry_id} from version 1 to 2."
            )
            data = config_entry.data.copy()
            if "update_interval" not in data:
                data["update_interval"] = 12
                _LOGGER.debug(
                    f"{LOG_PREFIX} 'update_interval' not found. Setting default to 12 hours."
                )
            else:
                _LOGGER.debug(
                    f"{LOG_PREFIX} 'update_interval' found: {data['update_interval']} hours."
                )
            hass.config_entries.async_update_entry(config_entry, data=data)
            _LOGGER.info(
                f"{LOG_PREFIX} Migration of config entry {config_entry.entry_id} to version 2 successful."
            )
        else:
            _LOGGER.debug(
                f"{LOG_PREFIX} No migration needed for entry_id={config_entry.entry_id}"
            )
        return True
    except Exception as exc:
        _LOGGER.exception(
            "%s Unexpected error during async_migrate_entry: %s", LOG_PREFIX, exc
        )
        return False


async def async_setup_entry(hass: HomeAssistant, config_entry: ConfigEntry) -> bool:
    """Set up UK Bin Collection from a config entry."""
    _LOGGER.info(
        f"{LOG_PREFIX} async_setup_entry called for entry_id={config_entry.entry_id}"
    )
    try:
        name = config_entry.data.get("name")
        if not name:
            _LOGGER.error(f"{LOG_PREFIX} 'name' is missing in config entry.")
            raise ConfigEntryNotReady("Missing 'name' in configuration.")

        timeout = config_entry.data.get("timeout", 60)
        manual_refresh = config_entry.data.get("manual_refresh_only", False)
        icon_color_mapping = config_entry.data.get("icon_color_mapping", "{}")
        update_interval_hours = config_entry.data.get("update_interval", 12)

        _LOGGER.debug(
            f"{LOG_PREFIX} Retrieved configuration: "
            f"name={name}, timeout={timeout}, "
            f"manual_refresh_only={manual_refresh}, "
            f"update_interval={update_interval_hours} hours, "
            f"icon_color_mapping={icon_color_mapping}"
        )

        # Validate 'timeout'
        try:
            timeout = int(timeout)
            if timeout < 10:
                _LOGGER.warning(
                    f"{LOG_PREFIX} Timeout value {timeout} is less than 10. Setting to 10 seconds."
                )
                timeout = 10
        except (ValueError, TypeError):
            _LOGGER.warning(
                f"{LOG_PREFIX} Invalid timeout value: {timeout}. Using default 60 seconds."
            )
            timeout = 60

        # Decide update interval: schedule automatic refreshes only when
        # manual-refresh-only mode is NOT enabled.
        if not manual_refresh:
            try:
                update_interval_hours = int(update_interval_hours)
                if update_interval_hours < 1:
                    update_interval_hours = 12
            except (ValueError, TypeError):
                update_interval_hours = 12
            update_interval = timedelta(hours=update_interval_hours)
            _LOGGER.info(
                "%s Automatic refresh every %s hour(s).",
                LOG_PREFIX,
                update_interval_hours,
            )
        else:
            update_interval = None
            _LOGGER.info(
                "%s Manual refresh only: no automatic updates scheduled.", LOG_PREFIX
            )

        # Prepare arguments for UKBinCollectionApp
        args = build_ukbcd_args(config_entry.data)
        _LOGGER.debug(f"{LOG_PREFIX} UKBinCollectionApp args: {args}")

        # Initialize the UK Bin Collection Data application
        ukbcd = UKBinCollectionApp()
        ukbcd.set_args(args)
        _LOGGER.debug(f"{LOG_PREFIX} UKBinCollectionApp initialized and arguments set.")

        # Initialize the data coordinator
        coordinator = HouseholdBinCoordinator(
            hass,
            ukbcd,
            name,
            timeout=timeout,
            update_interval=update_interval,
        )
        _LOGGER.debug(
            f"{LOG_PREFIX} HouseholdBinCoordinator initialized with update_interval={update_interval}."
        )

        # Perform first refresh
        await coordinator.async_config_entry_first_refresh()
        _LOGGER.info(
            f"{LOG_PREFIX} Initial data fetched successfully for entry_id={config_entry.entry_id}"
        )

        # Store the coordinator in Home Assistant's data
        hass.data[DOMAIN][config_entry.entry_id] = {"coordinator": coordinator}
        _LOGGER.debug(
            f"{LOG_PREFIX} Coordinator stored in hass.data under entry_id={config_entry.entry_id}"
        )

        # Forward the setup to all platforms (sensor and calendar)
        _LOGGER.debug(f"{LOG_PREFIX} Forwarding setup to platforms: {PLATFORMS}")
        await hass.config_entries.async_forward_entry_setups(config_entry, PLATFORMS)

        _LOGGER.info(
            f"{LOG_PREFIX} async_setup_entry finished for entry_id={config_entry.entry_id}"
        )
        return True
    except UpdateFailed as e:
        _LOGGER.error(f"{LOG_PREFIX} Unable to fetch initial data: {e}")
        raise ConfigEntryNotReady from e
    except Exception as exc:
        _LOGGER.exception(
            "%s Unexpected error in async_setup_entry: %s", LOG_PREFIX, exc
        )
        raise ConfigEntryNotReady from exc


async def async_unload_entry(hass: HomeAssistant, config_entry: ConfigEntry) -> bool:
    """Unload a config entry."""
    _LOGGER.info(f"{LOG_PREFIX} Unloading config entry {config_entry.entry_id}")
    unload_ok = True
    try:
        for platform in PLATFORMS:
            platform_unload_ok = await hass.config_entries.async_forward_entry_unload(
                config_entry, platform
            )
            if not platform_unload_ok:
                _LOGGER.warning(
                    f"{LOG_PREFIX} Failed to unload '{platform}' platform for entry_id={config_entry.entry_id}"
                )
                unload_ok = False
            else:
                _LOGGER.debug(
                    f"{LOG_PREFIX} Successfully unloaded '{platform}' for entry_id={config_entry.entry_id}"
                )

        if unload_ok:
            hass.data[DOMAIN].pop(config_entry.entry_id, None)
            _LOGGER.debug(
                f"{LOG_PREFIX} Removed coordinator for entry_id={config_entry.entry_id}"
            )
        else:
            _LOGGER.warning(
                f"{LOG_PREFIX} One or more platforms failed to unload for entry_id={config_entry.entry_id}"
            )
    except Exception as exc:
        _LOGGER.exception(
            "%s Unexpected error in async_unload_entry: %s", LOG_PREFIX, exc
        )
        unload_ok = False
    return unload_ok


def build_ukbcd_args(config_data: dict) -> list:
    """Build the argument list for UKBinCollectionApp from config data."""
    council = config_data.get("original_parser") or config_data.get("council", "")
    url = config_data.get("url", "")
    args = [council, url]

    # Per-key formatters: return a list of CLI args for that key
    def _format_headless(v):
        return ["--headless"] if v else ["--not-headless"]

    def _format_web_driver(v):
        return [f"--web_driver={v.rstrip('/')}"] if v is not None else []

    formatters = {
        "headless": _format_headless,
        "web_driver": _format_web_driver,
    }

    for key, value in config_data.items():
        if key in EXCLUDED_ARG_KEYS:
            continue
        fmt = formatters.get(key)
        if fmt:
            args.extend(fmt(value))
        else:
            args.append(f"--{key}={value}")
    return args


class HouseholdBinCoordinator(DataUpdateCoordinator):
    """Coordinator to manage fetching and updating UK Bin Collection data."""

    def __init__(
        self,
        hass: HomeAssistant,
        ukbcd: UKBinCollectionApp,
        name: str,
        timeout: int = 60,
        update_interval: timedelta = timedelta(hours=12),
    ) -> None:
        """Initialize the data coordinator."""
        super().__init__(
            hass,
            _LOGGER,
            name="UK Bin Collection Data",
            update_interval=update_interval,
        )
        self.ukbcd = ukbcd
        self.name = name
        self.timeout = timeout
        self._last_good_data = {}
        _LOGGER.debug(
            f"{LOG_PREFIX} HouseholdBinCoordinator __init__: name={name}, timeout={timeout}, update_interval={update_interval}"
        )

    async def _async_update_data(self) -> dict:
        """Fetch and process the latest bin collection data."""
        _LOGGER.debug(f"{LOG_PREFIX} _async_update_data called.")
        _LOGGER.info(
            f"{LOG_PREFIX} Fetching latest bin collection data with timeout={self.timeout}"
        )
        try:
            data = await asyncio.wait_for(
                self.hass.async_add_executor_job(self.ukbcd.run),
                timeout=self.timeout,
            )
            _LOGGER.debug(f"{LOG_PREFIX} Raw data fetched from ukbcd.run(): {data}")
            parsed_data = json.loads(data)
            _LOGGER.debug(f"{LOG_PREFIX} JSON parsed data: {parsed_data}")
            processed_data = self.process_bin_data(parsed_data)
            if not processed_data:
                _LOGGER.warning(
                    f"{LOG_PREFIX} No bin data found. Using last known good data."
                )
                if self._last_good_data:
                    return self._last_good_data
                else:
                    _LOGGER.warning(f"{LOG_PREFIX} No previous data to fall back to.")
                    return {}

            self._last_good_data = processed_data
            _LOGGER.debug(f"{LOG_PREFIX} Processed data: {processed_data}")
            _LOGGER.info(f"{LOG_PREFIX} Bin collection data updated successfully.")
            return processed_data
        except asyncio.TimeoutError as exc:
            _LOGGER.error(f"{LOG_PREFIX} Timeout while updating data: {exc}")
            raise UpdateFailed(f"Timeout while updating data: {exc}") from exc
        except json.JSONDecodeError as exc:
            _LOGGER.error(f"{LOG_PREFIX} JSON decode error: {exc}")
            raise UpdateFailed(f"JSON decode error: {exc}") from exc
        except Exception as exc:
            _LOGGER.exception(f"{LOG_PREFIX} Unexpected error: {exc}")
            raise UpdateFailed(f"Unexpected error: {exc}") from exc

    @staticmethod
    def process_bin_data(data: dict) -> dict:
        """Process raw data to determine the next collection dates."""
        _LOGGER.debug(f"{LOG_PREFIX} process_bin_data called with data={data}")
        current_date = dt_util.now().date()
        next_collection_dates = {}
        bins = data.get("bins", [])
        _LOGGER.debug(f"{LOG_PREFIX} Bins found: {bins}")

        for bin_data in bins:
            bin_type = bin_data.get("type")
            collection_date_str = bin_data.get("collectionDate")
            _LOGGER.debug(f"{LOG_PREFIX} Processing bin_data={bin_data}")

            if not bin_type or not collection_date_str:
                _LOGGER.warning(
                    f"{LOG_PREFIX} Missing 'type' or 'collectionDate' in bin data: {bin_data}"
                )
                continue

            try:
                collection_date = datetime.strptime(
                    collection_date_str, "%d/%m/%Y"
                ).date()
            except (ValueError, TypeError) as exc:
                _LOGGER.warning(
                    f"{LOG_PREFIX} Invalid date format '{collection_date_str}' for bin type '{bin_type}'. Error: {exc}"
                )
                continue

            if (
                collection_date < current_date
                and current_date.month == 12
                and collection_date.month == 1
            ):
                collection_date = collection_date.replace(year=current_date.year + 1)
                _LOGGER.debug(
                    f"{LOG_PREFIX} Corrected rollover year for '{bin_type}' to {collection_date}"
                )

            existing_date = next_collection_dates.get(bin_type)
            if collection_date >= current_date and (
                not existing_date or collection_date < existing_date
            ):
                next_collection_dates[bin_type] = collection_date
                _LOGGER.debug(
                    f"{LOG_PREFIX} Updated next collection for '{bin_type}' to {collection_date}"
                )

        _LOGGER.debug(
            f"{LOG_PREFIX} Final next_collection_dates={next_collection_dates}"
        )
        return next_collection_dates
================================================
FILE: custom_components/uk_bin_collection/calendar.py
================================================
"""Calendar platform support for UK Bin Collection Data."""

import logging
import uuid
from datetime import datetime, timedelta
from typing import Any, Dict, List, Optional

from homeassistant.components.calendar import CalendarEntity, CalendarEvent
from homeassistant.config_entries import ConfigEntry
from homeassistant.core import HomeAssistant, callback
from homeassistant.helpers.entity_platform import AddEntitiesCallback
from homeassistant.helpers.update_coordinator import (
    CoordinatorEntity,
    DataUpdateCoordinator,
)

from .const import DOMAIN, LOG_PREFIX

_LOGGER = logging.getLogger(__name__)


class UKBinCollectionCalendar(CoordinatorEntity, CalendarEntity):
    """Calendar entity for UK Bin Collection Data."""

    def __init__(
        self,
        coordinator: DataUpdateCoordinator,
        bin_type: str,
        unique_id: str,
        name: str,
    ) -> None:
        """Initialize the calendar entity."""
        super().__init__(coordinator)
        self._bin_type = bin_type
        self._unique_id = unique_id
        self._name = name
        self._attr_unique_id = unique_id
        # Optionally, set device_info if you have device grouping
        self._attr_device_info = {
            "identifiers": {(DOMAIN, unique_id)},
            "name": f"{self._name} Device",
            "manufacturer": "UK Bin Collection",
            "model": "Bin Collection Calendar",
            "sw_version": "1.0",
        }

    @property
    def name(self) -> str:
        """Return the name of the calendar."""
        return self._name

    @property
    def event(self) -> Optional[CalendarEvent]:
        """Return the next collection event."""
        collection_date = self.coordinator.data.get(self._bin_type)
        if not collection_date:
            _LOGGER.debug(
                f"{LOG_PREFIX} No collection date available for '{self._bin_type}'."
            )
            return None
        return self._create_calendar_event(collection_date)

    async def async_get_events(
        self, hass: HomeAssistant, start_date: datetime, end_date: datetime
    ) -> List[CalendarEvent]:
        """Return all events within a specific time frame."""
        events: List[CalendarEvent] = []
        collection_date = self.coordinator.data.get(self._bin_type)
        if not collection_date:
            return events
        # The test expects comparison between date parts.
        if start_date.date() <= collection_date <= end_date.date():
            events.append(self._create_calendar_event(collection_date))
        return events

    def _create_calendar_event(self, collection_date: datetime.date) -> CalendarEvent:
        """Create a CalendarEvent for a given collection date."""
        return CalendarEvent(
            summary=f"{self._bin_type} Collection",
            start=collection_date,
            end=collection_date + timedelta(days=1),
            uid=f"{self.unique_id}_{collection_date.isoformat()}",
        )

    @property
    def unique_id(self) -> str:
        """Return a unique ID for the calendar."""
        return self._unique_id

    @property
    def available(self) -> bool:
        """Return if entity is available.

        The entity is considered available if the coordinator’s last update
        was successful and we have a valid collection date for the bin type.
""" return self.coordinator.last_update_success and ( self.coordinator.data.get(self._bin_type) is not None ) @property def extra_state_attributes(self) -> Dict[str, Any]: """Return extra state attributes.""" return {} @callback def _handle_coordinator_update(self) -> None: """Handle updates from the coordinator and refresh calendar state.""" self.async_write_ha_state() async def async_setup_entry( hass: HomeAssistant, config_entry: ConfigEntry, async_add_entities: AddEntitiesCallback, ) -> None: """Set up UK Bin Collection Calendar from a config entry.""" _LOGGER.info(f"{LOG_PREFIX} Setting up UK Bin Collection Calendar platform.") # Retrieve the coordinator from hass.data coordinator: DataUpdateCoordinator = hass.data[DOMAIN][config_entry.entry_id][ "coordinator" ] # Wait for the first refresh. This will raise if the update fails. await coordinator.async_config_entry_first_refresh() # Create calendar entities only for bin types that have a valid date entities = [] for bin_type, collection_date in coordinator.data.items(): if collection_date is None: continue unique_id = calc_unique_calendar_id(config_entry.entry_id, bin_type) name = f"{coordinator.name} {bin_type} Calendar" entities.append( UKBinCollectionCalendar( coordinator=coordinator, bin_type=bin_type, unique_id=unique_id, name=name, ) ) # Register all calendar entities with Home Assistant async_add_entities(entities) _LOGGER.debug( f"{LOG_PREFIX} Calendar entities added: {[entity.name for entity in entities]}" ) async def async_unload_entry( hass: HomeAssistant, config_entry: ConfigEntry, async_remove_entities: Any, ) -> bool: """Unload a config entry.""" # Unloading is handled in init.py return True def calc_unique_calendar_id(entry_id: str, bin_type: str) -> str: """Calculate a unique ID for the calendar.""" return f"{entry_id}_{bin_type}_calendar" ================================================ FILE: custom_components/uk_bin_collection/config_flow.py ================================================ 
import asyncio
import collections
import json
import logging
import shutil
from typing import Any, Dict, Optional

import aiohttp
import homeassistant.helpers.config_validation as cv
import voluptuous as vol
from homeassistant import config_entries
from homeassistant.core import callback

from .const import (
    BROWSER_BINARIES,
    DOMAIN,
    INPUT_JSON_URL,
    LOG_PREFIX,
    SELENIUM_SERVER_URLS,
)

_LOGGER = logging.getLogger(__name__)


class UkBinCollectionConfigFlow(config_entries.ConfigFlow, domain=DOMAIN):
    """Handle a config flow for UkBinCollection."""

    VERSION = 3  # Incremented version for config flow changes

    def __init__(self):
        self.councils_data: Optional[Dict[str, Any]] = None
        self.data: Dict[str, Any] = {}
        self.council_names: list = []
        self.council_options: list = []
        self.selenium_checked: bool = False
        self.selenium_available: bool = False
        self.selenium_results: list = []
        self.chromium_checked: bool = False
        self.chromium_installed: bool = False

    async def async_migrate_entry(
        self, config_entry: config_entries.ConfigEntry
    ) -> bool:
        """Migrate old entry to the new version with manual refresh ticked."""
        _LOGGER.info("Migrating config entry from version %s", config_entry.version)
        data = dict(config_entry.data)

        if config_entry.version < 3:
            # If the manual_refresh_only key is not present, add it and set to True.
if "manual_refresh_only" not in data: _LOGGER.info("Setting 'manual_refresh_only' to True in the migration") data["manual_refresh_only"] = True self.hass.config_entries.async_update_entry(config_entry, data=data) _LOGGER.info("Migration to version %s successful", self.VERSION) return True async def async_step_user(self, user_input: Optional[Dict[str, Any]] = None): """Handle the initial step.""" errors = {} if self.councils_data is None: self.councils_data = await self.get_councils_json() if not self.councils_data: _LOGGER.error("Council data is unavailable.") return self.async_abort(reason="Council Data Unavailable") self.council_names = list(self.councils_data.keys()) self.council_options = [ self.councils_data[name]["wiki_name"] for name in self.council_names ] _LOGGER.debug("Loaded council data: %s", self.council_names) if user_input is not None: _LOGGER.debug("User input received: %s", user_input) # Validate user input if not user_input.get("name"): errors["name"] = "Name is required." if not user_input.get("council"): errors["council"] = "Council is required." # Validate JSON mapping if provided if user_input.get("icon_color_mapping"): if not self.is_valid_json(user_input["icon_color_mapping"]): errors["icon_color_mapping"] = "Invalid JSON format." # Check for duplicate entries if not errors: existing_entry = await self._async_entry_exists(user_input) if existing_entry: errors["base"] = "duplicate_entry" _LOGGER.warning( "Duplicate entry found: %s", existing_entry.data.get("name") ) if not errors: # Map selected wiki_name back to council key council_key = self.map_wiki_name_to_council_key(user_input["council"]) if not council_key: errors["council"] = "Invalid council selected." 
                    return self.async_show_form(
                        step_id="user", data_schema=..., errors=errors
                    )

                user_input["council"] = council_key
                # Add original_parser if it's an alias
                if "original_parser" in self.councils_data[council_key]:
                    user_input["original_parser"] = self.councils_data[council_key][
                        "original_parser"
                    ]

                self.data.update(user_input)
                _LOGGER.debug("User input after mapping: %s", self.data)

                # Proceed to the council step
                return await self.async_step_council()

        # Show the initial form
        return self.async_show_form(
            step_id="user",
            data_schema=vol.Schema(
                {
                    vol.Required("name"): cv.string,
                    vol.Required("council"): vol.In(self.council_options),
                    vol.Optional("manual_refresh_only", default=True): bool,
                    vol.Optional("icon_color_mapping", default=""): cv.string,
                }
            ),
            errors=errors,
            description_placeholders={"cancel": "Press Cancel to abort setup."},
        )

    async def async_step_council(self, user_input: Optional[Dict[str, Any]] = None):
        """Second step to configure the council details."""
        errors = {}
        council_key = self.data.get("council")
        council_info = self.councils_data.get(council_key, {})
        requires_selenium = "web_driver" in council_info

        if user_input is not None:
            _LOGGER.debug("Council step user input: %s", user_input)

            # Validate JSON mapping if provided
            if user_input.get("icon_color_mapping"):
                if not self.is_valid_json(user_input["icon_color_mapping"]):
                    errors["icon_color_mapping"] = "Invalid JSON format."
# Handle 'skip_get_url' if necessary if council_info.get("skip_get_url", False): user_input["skip_get_url"] = True user_input["url"] = council_info.get("url", "") # Merge user_input with existing data self.data.update(user_input) # If no errors, create the config entry if not errors: _LOGGER.info( "%s Creating config entry with data: %s", LOG_PREFIX, self.data ) return self.async_create_entry(title=self.data["name"], data=self.data) else: _LOGGER.debug("Errors in council step: %s", errors) # Prepare description placeholders description_placeholders = {} if requires_selenium: description = await self.perform_selenium_checks(council_key) description_placeholders["selenium_message"] = description else: description_placeholders["selenium_message"] = "" # Show the form return self.async_show_form( step_id="council", data_schema=await self.get_council_schema(council_key), errors=errors, description_placeholders=description_placeholders, ) async def async_step_reconfigure(self, user_input: Optional[Dict[str, Any]] = None): """Handle reconfiguration of the integration.""" return await self.async_step_reconfigure_confirm() async def async_step_reconfigure_confirm( self, user_input: Optional[Dict[str, Any]] = None ): """Handle a reconfiguration flow initialized by the user.""" errors = {} existing_entry = self.hass.config_entries.async_get_entry( self.context["entry_id"] ) if existing_entry is None: _LOGGER.error("Reconfiguration failed: Config entry not found.") return self.async_abort(reason="Reconfigure Failed") if self.councils_data is None: self.councils_data = await self.get_councils_json() self.council_names = list(self.councils_data.keys()) self.council_options = [ self.councils_data[name]["wiki_name"] for name in self.council_names ] _LOGGER.debug("Loaded council data for reconfiguration.") council_key = existing_entry.data.get("council") council_info = self.councils_data.get(council_key, {}) council_wiki_name = council_info.get("wiki_name", "") if user_input is not 
None: _LOGGER.debug("Reconfigure user input: %s", user_input) # Map selected wiki_name back to council key council_key = self.map_wiki_name_to_council_key(user_input["council"]) user_input["council"] = council_key # Validate update_interval update_interval = user_input.get("update_interval") if update_interval is not None: try: update_interval = int(update_interval) if update_interval < 1: errors["update_interval"] = "Must be at least 1 hour." except ValueError: errors["update_interval"] = "Invalid number." # Validate JSON mapping if provided if user_input.get("icon_color_mapping"): if not self.is_valid_json(user_input["icon_color_mapping"]): errors["icon_color_mapping"] = "Invalid JSON format." if not errors: # Merge the user input with existing data data = {**existing_entry.data, **user_input} data["icon_color_mapping"] = user_input.get("icon_color_mapping", "") self.hass.config_entries.async_update_entry( existing_entry, title=user_input.get("name", existing_entry.title), data=data, ) # Trigger a data refresh by reloading the config entry await self.hass.config_entries.async_reload(existing_entry.entry_id) _LOGGER.info( "Configuration updated for entry: %s", existing_entry.entry_id ) return self.async_abort(reason="Reconfigure Successful") else: _LOGGER.debug("Errors in reconfiguration: %s", errors) # Build the schema with existing data schema = self.build_reconfigure_schema(existing_entry.data, council_wiki_name) return self.async_show_form( step_id="reconfigure_confirm", data_schema=schema, errors=errors, description_placeholders={"selenium_message": ""}, ) async def get_councils_json(self) -> Dict[str, Any]: """Fetch and return the supported councils data, including aliases and sorted alphabetically.""" try: async with aiohttp.ClientSession() as session: async with session.get(INPUT_JSON_URL) as response: response.raise_for_status() data_text = await response.text() original_data = json.loads(data_text) normalized_data = {} for key, value in 
original_data.items(): normalized_data[key] = value for alias in value.get("supported_councils", []): alias_data = value.copy() alias_data["original_parser"] = key alias_data["wiki_name"] = ( f"{alias.replace('Council', ' Council')} (via Google Calendar)" ) normalized_data[alias] = alias_data # Sort alphabetically by key (council ID) sorted_data = dict(sorted(normalized_data.items())) _LOGGER.debug( "Loaded and sorted %d councils (with aliases)", len(sorted_data) ) return sorted_data except Exception as e: _LOGGER.exception("Error fetching council data: %s", e) return {} async def get_council_schema(self, council: str) -> vol.Schema: """Generate the form schema based on council requirements.""" council_info = self.councils_data.get(council, {}) fields = {} if not council_info.get("skip_get_url", False) or council_info.get( "custom_component_show_url_field" ): fields[vol.Required("url")] = cv.string if "uprn" in council_info: fields[vol.Required("uprn")] = cv.string if "postcode" in council_info: fields[vol.Required("postcode")] = cv.string if "house_number" in council_info: fields[vol.Required("number")] = cv.string if "usrn" in council_info: fields[vol.Required("usrn")] = cv.string if "web_driver" in council_info: fields[vol.Optional("web_driver", default="")] = cv.string fields[vol.Optional("headless", default=True)] = bool fields[vol.Optional("local_browser", default=False)] = bool fields[vol.Optional("timeout", default=60)] = vol.All( vol.Coerce(int), vol.Range(min=10) ) fields[vol.Optional("update_interval", default=12)] = vol.All( cv.positive_int, vol.Range(min=1) ) return vol.Schema(fields) def build_reconfigure_schema( self, existing_data: Dict[str, Any], council_wiki_name: str ) -> vol.Schema: """Build the schema for reconfiguration with existing data.""" fields = { vol.Required("name", default=existing_data.get("name", "")): str, vol.Required("council", default=council_wiki_name): vol.In( self.council_options ), vol.Optional( "manual_refresh_only", 
default=existing_data.get("manual_refresh_only", False), ): bool, vol.Required( "update_interval", default=existing_data.get("update_interval", 12) ): vol.All(cv.positive_int, vol.Range(min=1)), } optional_fields = [ ("url", cv.string), ("uprn", cv.string), ("postcode", cv.string), ("number", cv.string), ("web_driver", cv.string), ("headless", bool), ("local_browser", bool), ("timeout", vol.All(vol.Coerce(int), vol.Range(min=10))), ] for field_name, validator in optional_fields: if field_name in existing_data: fields[vol.Optional(field_name, default=existing_data[field_name])] = ( validator ) fields[ vol.Optional( "icon_color_mapping", default=existing_data.get("icon_color_mapping", ""), ) ] = str return vol.Schema(fields) async def perform_selenium_checks(self, council_key: str) -> str: """Perform Selenium and Chromium checks and return a formatted message.""" messages = [] council_info = self.councils_data.get(council_key, {}) council_name = council_info.get("wiki_name", council_key) custom_selenium_url = self.data.get("selenium_url") selenium_results = await self.check_selenium_server(custom_selenium_url) self.selenium_available = any(accessible for _, accessible in selenium_results) self.selenium_checked = True self.chromium_installed = await self.check_chromium_installed() self.chromium_checked = True # Start building the formatted message messages.append(f"{council_name} requires Selenium to run.\n\n") # Selenium server check results messages.append("Remote Selenium server URLs checked:\n") for url, accessible in selenium_results: status = "✅ Accessible" if accessible else "❌ Not accessible" messages.append(f"{url}: {status}\n") # Chromium installation check chromium_status = ( "✅ Installed" if self.chromium_installed else "❌ Not installed" ) messages.append("\nLocal Chromium browser check:\n") messages.append(f"Chromium browser is {chromium_status}.") # Combine messages return "".join(messages) async def check_selenium_server(self, custom_url: Optional[str] = None) -> list: """Check if Selenium servers are accessible.""" urls = SELENIUM_SERVER_URLS.copy() if custom_url: urls.insert(0, custom_url) results = [] async with aiohttp.ClientSession() as session: for url in urls: try: async with session.get(url, timeout=5) as response: response.raise_for_status() accessible = response.status == 200 results.append((url, accessible)) _LOGGER.debug("Selenium server %s is accessible.", url) except aiohttp.ClientError as e: _LOGGER.warning( "Failed to connect to Selenium server at %s: %s", url, e ) results.append((url, False)) except Exception as e: _LOGGER.exception( "Unexpected error checking Selenium server at %s: %s", url, e ) results.append((url, False)) return results async def check_chromium_installed(self) -> bool: """Check if Chromium is installed.""" loop = asyncio.get_event_loop() result = await loop.run_in_executor(None, self._sync_check_chromium) if result: _LOGGER.debug("Chromium is installed.") else: _LOGGER.warning("Chromium is not installed.") return result def _sync_check_chromium(self) -> bool: """Synchronous check for Chromium installation.""" for exec_name in BROWSER_BINARIES: try: if shutil.which(exec_name): _LOGGER.debug(f"Found Chromium executable: {exec_name}") return True except Exception as e: _LOGGER.error( f"Exception while checking for executable '{exec_name}': {e}" ) continue # Continue checking other binaries _LOGGER.debug("No Chromium executable found.") return False def map_wiki_name_to_council_key(self, wiki_name: str) -> str: """Map the council wiki name back to the council key.""" try: index = self.council_options.index(wiki_name) council_key = self.council_names[index] _LOGGER.debug( "Mapped wiki name '%s' to council key '%s'.", wiki_name, council_key ) return council_key except ValueError: _LOGGER.error("Wiki name '%s' not
found in council options.", wiki_name) return "" @staticmethod def is_valid_json(json_str: str) -> bool: """Validate if a string is valid JSON.""" try: json.loads(json_str) return True except json.JSONDecodeError as e: _LOGGER.debug("JSON decode error: %s", e) return False async def _async_entry_exists( self, user_input: Dict[str, Any] ) -> Optional[config_entries.ConfigEntry]: """Check if a config entry with the same name or data already exists.""" for entry in self._async_current_entries(): if entry.data.get("name") == user_input.get("name"): return entry if entry.data.get("council") == user_input.get( "council" ) and entry.data.get("url") == user_input.get("url"): return entry return None async def async_step_import( self, import_config: Dict[str, Any] ) -> config_entries.FlowResult: """Handle import from configuration.yaml.""" return await self.async_step_user(import_config) class UkBinCollectionOptionsFlowHandler(config_entries.OptionsFlow): """Handle options flow for UkBinCollection.""" def __init__(self, config_entry): """Initialize options flow.""" self.config_entry = config_entry self.councils_data: Optional[Dict[str, Any]] = None self.council_names: list = [] self.council_options: list = [] async def async_step_init(self, user_input=None): """Manage the options.""" errors = {} existing_data = self.config_entry.data # Fetch council data self.councils_data = await self.get_councils_json() if not self.councils_data: _LOGGER.error("Council data is unavailable for options flow.") return self.async_abort(reason="Council Data Unavailable") self.council_names = list(self.councils_data.keys()) self.council_options = [ self.councils_data[name]["wiki_name"] for name in self.council_names ] _LOGGER.debug("Loaded council data for options flow.") if user_input is not None: _LOGGER.debug("Options flow user input: %s", user_input) # Map selected wiki_name back to council key council_key = self.map_wiki_name_to_council_key(user_input["council"]) user_input["council"] = 
council_key # Validate update_interval update_interval = user_input.get("update_interval") if update_interval is not None: try: update_interval = int(update_interval) if update_interval < 1: errors["update_interval"] = "Must be at least 1 hour." except ValueError: errors["update_interval"] = "Invalid number." # Validate JSON mapping if provided if user_input.get("icon_color_mapping"): if not UkBinCollectionConfigFlow.is_valid_json( user_input["icon_color_mapping"] ): errors["icon_color_mapping"] = "Invalid JSON format." if user_input.get("manual_refresh_only"): user_input["update_interval"] = None if not errors: # Merge the user input with existing data data = {**existing_data, **user_input} data["icon_color_mapping"] = user_input.get("icon_color_mapping", "") self.hass.config_entries.async_update_entry( self.config_entry, data=data, ) # Trigger a data refresh by reloading the config entry await self.hass.config_entries.async_reload(self.config_entry.entry_id) _LOGGER.info("Options updated and config entry reloaded.") return self.async_create_entry(title="", data={}) else: _LOGGER.debug("Errors in options flow: %s", errors) # Build the form with existing data schema = self.build_options_schema(existing_data) return self.async_show_form( step_id="init", data_schema=schema, errors=errors, description_placeholders={"cancel": "Press Cancel to abort setup."}, ) async def get_councils_json(self) -> Dict[str, Any]: """Fetch and return the supported councils data.""" try: async with aiohttp.ClientSession() as session: async with session.get(INPUT_JSON_URL) as response: response.raise_for_status() data_text = await response.text() return json.loads(data_text) except aiohttp.ClientError as e: _LOGGER.error( "HTTP error while fetching council data for options flow: %s", e ) except json.JSONDecodeError as e: _LOGGER.error("Error decoding council data JSON for options flow: %s", e) except Exception as e: _LOGGER.exception( "Unexpected error while fetching council data for 
options flow: %s", e ) return {} def build_options_schema(self, existing_data: Dict[str, Any]) -> vol.Schema: """Build the schema for the options flow with existing data.""" council_current_key = existing_data.get("council", "") try: council_current_wiki = self.council_options[ self.council_names.index(council_current_key) ] except (ValueError, IndexError): council_current_wiki = "" fields = { vol.Required("name", default=existing_data.get("name", "")): str, vol.Required("council", default=council_current_wiki): vol.In( self.council_options ), vol.Optional("manual_refresh_only", default=False): bool, vol.Required( "update_interval", default=existing_data.get("update_interval", 12) ): vol.All(cv.positive_int, vol.Range(min=1)), } optional_fields = [ ("icon_color_mapping", cv.string), # Add other optional fields if necessary ] for field_name, validator in optional_fields: if field_name in existing_data: fields[vol.Optional(field_name, default=existing_data[field_name])] = ( validator ) return vol.Schema(fields) def map_wiki_name_to_council_key(self, wiki_name: str) -> str: """Map the council wiki name back to the council key.""" try: index = self.council_options.index(wiki_name) council_key = self.council_names[index] _LOGGER.debug( "Mapped wiki name '%s' to council key '%s'.", wiki_name, council_key ) return council_key except ValueError: _LOGGER.error("Wiki name '%s' not found in council options.", wiki_name) return "" @staticmethod def is_valid_json(json_str: str) -> bool: """Validate if a string is valid JSON.""" try: json.loads(json_str) return True except json.JSONDecodeError as e: _LOGGER.debug("JSON decode error in options flow: %s", e) return False async def async_get_options_flow(config_entry): """Get the options flow for this handler.""" return UkBinCollectionOptionsFlowHandler(config_entry) ================================================ FILE: custom_components/uk_bin_collection/const.py ================================================ """Constants for 
UK Bin Collection Data.""" from datetime import timedelta from homeassistant.const import Platform INPUT_JSON_URL = "https://raw.githubusercontent.com/robbrad/UKBinCollectionData/0.165.0/uk_bin_collection/tests/input.json" DEFAULT_NAME = "UK Bin Collection Data" DOMAIN = "uk_bin_collection" LOG_PREFIX = "[UKBinCollection]" STATE_ATTR_COLOUR = "colour" STATE_ATTR_NEXT_COLLECTION = "next_collection" STATE_ATTR_DAYS = "days" DEVICE_CLASS = "bin_collection_schedule" PLATFORMS = [Platform.SENSOR, Platform.CALENDAR] SELENIUM_SERVER_URLS = ["http://localhost:4444", "http://selenium:4444"] BROWSER_BINARIES = ["chromium", "chromium-browser", "google-chrome"] EXCLUDED_ARG_KEYS = { "name", "council", "url", "skip_get_url", "local_browser", "timeout", "icon_color_mapping", "update_interval", "manual_refresh_only", "original_parser", } ================================================ FILE: custom_components/uk_bin_collection/manifest.json ================================================ { "domain": "uk_bin_collection", "name": "UK Bin Collection Data", "after_dependencies": [], "codeowners": ["@robbrad"], "config_flow": true, "dependencies": [], "documentation": "https://github.com/robbrad/UKBinCollectionData/wiki", "integration_type": "service", "iot_class": "cloud_polling", "issue_tracker": "https://github.com/robbrad/UKBinCollectionData/issues", "requirements": ["uk-bin-collection>=0.165.0"], "version": "0.165.0", "zeroconf": [] } ================================================ FILE: custom_components/uk_bin_collection/sensor.py ================================================ """Support for UK Bin Collection Data sensors.""" from datetime import datetime, timedelta import json import logging import asyncio from typing import Any, Dict from json import JSONDecodeError from homeassistant.core import HomeAssistant, callback from homeassistant.config_entries import ConfigEntry from homeassistant.components.sensor import SensorEntity from
homeassistant.exceptions import ConfigEntryNotReady from homeassistant.helpers.update_coordinator import ( CoordinatorEntity, DataUpdateCoordinator, UpdateFailed, ) from homeassistant.util import dt as dt_util from homeassistant.helpers.entity_platform import AddEntitiesCallback import homeassistant.helpers.config_validation as cv from .const import ( DOMAIN, LOG_PREFIX, STATE_ATTR_DAYS, STATE_ATTR_NEXT_COLLECTION, DEVICE_CLASS, STATE_ATTR_COLOUR, PLATFORMS, ) from uk_bin_collection.uk_bin_collection.collect_data import UKBinCollectionApp _LOGGER = logging.getLogger(__name__) async def async_setup_entry( hass: HomeAssistant, config_entry: ConfigEntry, async_add_entities: AddEntitiesCallback, ) -> None: """Set up the UK Bin Collection Data sensor platform.""" _LOGGER.info(f"{LOG_PREFIX} Setting up UK Bin Collection Data platform.") # Retrieve the coordinator from hass.data coordinator: DataUpdateCoordinator = hass.data[DOMAIN][config_entry.entry_id][ "coordinator" ] # Get icon_color_mapping from config icon_color_mapping = config_entry.data.get("icon_color_mapping", "{}") # Create sensor entities entities = create_sensor_entities( coordinator, config_entry.entry_id, icon_color_mapping ) # Register all sensor entities with Home Assistant async_add_entities(entities) def create_sensor_entities(coordinator, entry_id, icon_color_mapping): """Create sensor entities based on coordinator data.""" entities = [] icon_color_map = load_icon_color_mapping(icon_color_mapping) for bin_type in coordinator.data.keys(): device_id = f"{entry_id}_{bin_type}" # Main bin sensor entities.append( UKBinCollectionDataSensor(coordinator, bin_type, device_id, icon_color_map) ) # Attribute sensors attributes = [ "Colour", "Next Collection Human Readable", "Days Until Collection", "Bin Type", "Next Collection Date", ] for attr in attributes: unique_id = f"{device_id}_{attr.lower().replace(' ', '_')}" entities.append( UKBinCollectionAttributeSensor( coordinator, bin_type, unique_id, attr, 
device_id, icon_color_map ) ) # Add the Raw JSON Sensor entities.append( UKBinCollectionRawJSONSensor(coordinator, f"{entry_id}_raw_json", entry_id) ) return entities def load_icon_color_mapping(icon_color_mapping: str) -> Dict[str, Any]: """Load and return the icon color mapping.""" try: return json.loads(icon_color_mapping) if icon_color_mapping else {} except JSONDecodeError: _LOGGER.warning( f"{LOG_PREFIX} Invalid icon_color_mapping JSON: {icon_color_mapping}. Using default settings." ) return {} class UKBinCollectionDataSensor(CoordinatorEntity, SensorEntity): """Sensor entity for individual bin collection data.""" _attr_device_class = DEVICE_CLASS def __init__( self, coordinator: DataUpdateCoordinator, bin_type: str, device_id: str, icon_color_mapping: Dict[str, Any], ) -> None: """Initialize the main bin sensor.""" super().__init__(coordinator) self.coordinator = coordinator self._bin_type = bin_type self._device_id = device_id self._icon_color_mapping = icon_color_mapping self._icon = self.get_icon() self._color = self.get_color() self._state = None self._next_collection = None self._days = None self.update_state() @property def device_info(self) -> dict: """Return device information for device registry.""" return { "identifiers": {(DOMAIN, self._device_id)}, "name": f"{self.coordinator.name} {self._bin_type}", "manufacturer": "UK Bin Collection", "model": "Bin Sensor", "sw_version": "1.0", } @callback def _handle_coordinator_update(self) -> None: """Handle updates from the coordinator and refresh sensor state.""" self.update_state() self.async_write_ha_state() def update_state(self) -> None: """Update the sensor's state and attributes.""" bin_date = self.coordinator.data.get(self._bin_type) if bin_date: self._next_collection = bin_date now = dt_util.now().date() self._days = (bin_date - now).days self._state = self.calculate_state() else: _LOGGER.warning( f"{LOG_PREFIX} Data for bin type '{self._bin_type}' is missing." 
) self._state = "Unknown" self._days = None self._next_collection = None def calculate_state(self) -> str: """Determine the state based on collection date.""" now = dt_util.now().date() if self._next_collection == now: return "Today" elif self._next_collection == now + timedelta(days=1): return "Tomorrow" else: day_label = "day" if self._days == 1 else "days" return f"In {self._days} {day_label}" def get_icon(self) -> str: """Return the icon based on bin type or mapping.""" return self._icon_color_mapping.get(self._bin_type, {}).get( "icon", self.get_default_icon() ) def get_color(self) -> str: """Return the color based on bin type or mapping.""" color = self._icon_color_mapping.get(self._bin_type, {}).get("color") if color is None: return "black" return color def get_default_icon(self) -> str: """Return a default icon based on the bin type.""" bin_type_lower = self._bin_type.lower() if "recycling" in bin_type_lower: return "mdi:recycle" elif "waste" in bin_type_lower: return "mdi:trash-can" else: return "mdi:delete" @property def name(self) -> str: """Return the name of the sensor.""" return f"{self.coordinator.name} {self._bin_type}" @property def state(self) -> str: """Return the current state of the sensor.""" return self._state or "Unknown" @property def icon(self) -> str: """Return the icon for the sensor.""" return self._icon @property def extra_state_attributes(self) -> dict: """Return extra state attributes for the sensor.""" return { STATE_ATTR_COLOUR: self._color, STATE_ATTR_NEXT_COLLECTION: ( self._next_collection.strftime("%d/%m/%Y") if self._next_collection else None ), STATE_ATTR_DAYS: self._days, } @property def available(self) -> bool: """Return the availability of the sensor.""" return self._state != "Unknown" @property def unique_id(self) -> str: """Return a unique ID for the sensor.""" return self._device_id class UKBinCollectionAttributeSensor(CoordinatorEntity, SensorEntity): """Sensor entity for additional attributes of a bin.""" def 
__init__( self, coordinator: DataUpdateCoordinator, bin_type: str, unique_id: str, attribute_type: str, device_id: str, icon_color_mapping: Dict[str, Any], ) -> None: """Initialize the attribute sensor.""" super().__init__(coordinator) self.coordinator = coordinator self._bin_type = bin_type self._unique_id = unique_id self._attribute_type = attribute_type self._device_id = device_id self._icon_color_mapping = icon_color_mapping self._icon = self.get_icon() self._color = self.get_color() @property def name(self) -> str: """Return the name of the attribute sensor.""" return f"{self.coordinator.name} {self._bin_type} {self._attribute_type}" @property def state(self): """Return the state based on the attribute type.""" if self._attribute_type == "Colour": return self._color elif self._attribute_type == "Bin Type": return self._bin_type elif self._attribute_type == "Next Collection Date": bin_date = self.coordinator.data.get(self._bin_type) return bin_date.strftime("%d/%m/%Y") if bin_date else "Unknown" elif self._attribute_type == "Next Collection Human Readable": return self.calculate_human_readable() elif self._attribute_type == "Days Until Collection": return self.calculate_days_until() else: _LOGGER.warning( f"{LOG_PREFIX} Undefined attribute type: {self._attribute_type}" ) return "Undefined" def calculate_human_readable(self) -> str: """Calculate human-readable collection date.""" bin_date = self.coordinator.data.get(self._bin_type) if not bin_date: return "Unknown" now = dt_util.now().date() days = (bin_date - now).days if days == 0: return "Today" elif days == 1: return "Tomorrow" else: day_label = "day" if days == 1 else "days" return f"In {days} {day_label}" def calculate_days_until(self) -> int: """Calculate days until collection.""" bin_date = self.coordinator.data.get(self._bin_type) if not bin_date: return -1 return (bin_date - dt_util.now().date()).days def get_icon(self) -> str: """Return the icon based on bin type or mapping.""" return 
self._icon_color_mapping.get(self._bin_type, {}).get( "icon", self.get_default_icon() ) def get_color(self) -> str: """Return the color based on bin type or mapping.""" return self._icon_color_mapping.get(self._bin_type, {}).get("color", "black") def get_default_icon(self) -> str: """Return a default icon based on the bin type.""" bin_type_lower = self._bin_type.lower() if "recycling" in bin_type_lower: return "mdi:recycle" elif "waste" in bin_type_lower: return "mdi:trash-can" else: return "mdi:delete" @property def icon(self) -> str: """Return the icon for the attribute sensor.""" return self._icon @property def extra_state_attributes(self) -> dict: """Return the extra state attributes.""" return { STATE_ATTR_COLOUR: self._color, STATE_ATTR_NEXT_COLLECTION: self.coordinator.data.get(self._bin_type), } @property def device_info(self) -> dict: """Return device information for device registry.""" return { "identifiers": {(DOMAIN, self._device_id)}, "name": f"{self.coordinator.name} {self._bin_type}", "manufacturer": "UK Bin Collection", "model": "Bin Sensor", "sw_version": "1.0", } @property def unique_id(self) -> str: """Return a unique ID for the attribute sensor.""" return self._unique_id @property def available(self) -> bool: """Return the availability of the attribute sensor.""" return self.coordinator.last_update_success class UKBinCollectionRawJSONSensor(CoordinatorEntity, SensorEntity): """Sensor entity to hold the raw JSON data for bin collections.""" def __init__( self, coordinator: DataUpdateCoordinator, unique_id: str, name: str, ) -> None: """Initialize the raw JSON sensor.""" super().__init__(coordinator) self.coordinator = coordinator self._unique_id = unique_id self._name = f"{name} Raw JSON" @property def name(self) -> str: """Return the name of the raw JSON sensor.""" return self._name @property def state(self) -> str: """Return the raw JSON data as the state.""" if not self.coordinator.data: return "{}" data = { bin_type: 
bin_date.strftime("%d/%m/%Y") if bin_date else None for bin_type, bin_date in self.coordinator.data.items() } return json.dumps(data) @property def unique_id(self) -> str: """Return a unique ID for the raw JSON sensor.""" return self._unique_id @property def extra_state_attributes(self) -> dict: """Return the raw JSON data as an attribute.""" return {"raw_data": self.coordinator.data or {}} @property def available(self) -> bool: """Return the availability of the raw JSON sensor.""" return self.coordinator.last_update_success ================================================ FILE: custom_components/uk_bin_collection/services.yaml ================================================ manual_refresh: name: "Manual Refresh" description: "Manually refresh bin data for a specific config entry." fields: entry_id: name: "Entry ID" description: "Config Entry ID for the UK Bin Collection integration instance to refresh." example: "1234567890abcdef" ================================================ FILE: custom_components/uk_bin_collection/strings.json ================================================ { "title": "UK Bin Collection Data", "config": { "step": { "user": { "title": "Select the council", "data": { "name": "Location name", "council": "Council", "manual_refresh_only": "Manual refresh only (disable automatic updates)", "icon_color_mapping": "JSON to map Bin Type for Colour and Icon (see documentation)" }, "description": "Please see the documentation if your council isn't listed" }, "council": { "title": "Provide council details", "data": { "url": "URL to fetch bin collection data", "timeout": "The time in seconds for how long the sensor should wait for data", "update_interval": "Time in hours between updates", "uprn": "UPRN (Unique Property Reference Number)", "postcode": "Postcode of the address", "number": "House number of the address", "usrn": "USRN (Unique Street Reference Number)", "web_driver": "To run on a remote Selenium Server add the Selenium Server URL", "headless": "Run Selenium in headless mode (recommended)", "local_browser": "Don't run on remote Selenium server, use local install of Chrome instead", "submit": "Submit" }, "description": "Please refer to your council's wiki entry for details on what to enter.\n{selenium_message}" }, "reconfigure_confirm": { "title": "Update council details", "data": { "url": "URL to fetch bin collection data", "timeout": "The time in seconds for how long the sensor should wait for data", "update_interval": "Time in hours between updates", "uprn": "UPRN (Unique Property Reference Number)", "postcode": "Postcode of the address", "number": "House number of the address", "usrn": "USRN (Unique Street Reference Number)", "web_driver": "To run on a remote Selenium Server add the Selenium Server URL", "headless": "Run Selenium in headless mode (recommended)", "local_browser": "Don't run on remote Selenium server, use local install of Chrome instead", "manual_refresh_only": "Manual refresh only (disable automatic updates)", "icon_color_mapping": "JSON to map Bin Type for Colour and Icon (see documentation)", "submit": "Submit" }, "description": "Please refer to your council's wiki entry for details on what to enter." } }, "error": { "name": "Please enter a location name", "council": "Please select a council", "selenium_unavailable": "Selenium server is not accessible. Please ensure it is running at localhost:4444 or selenium:4444", "chromium_not_found": "Chromium browser is not installed.
Please install Chromium or Google Chrome" } } } ================================================ FILE: custom_components/uk_bin_collection/tests/__init__.py ================================================ ================================================ FILE: custom_components/uk_bin_collection/tests/common_utils.py ================================================ # custom_components/uk_bin_collection/tests/common_utils.py import uuid from unittest.mock import Mock, AsyncMock # Import AsyncMock from homeassistant import config_entries import asyncio class MockConfigEntry: """Mock for Home Assistant ConfigEntry.""" def __init__( self, domain, data=None, options=None, title=None, unique_id=None, source=config_entries.SOURCE_USER, entry_id=None, version=1, ): """Initialize a mock config entry.""" self.domain = domain self.data = data or {} self.options = options or {} self.title = title or "Mock Title" self.unique_id = unique_id self.source = source self.entry_id = entry_id or uuid.uuid4().hex self.version = version self.state = config_entries.ConfigEntryState.NOT_LOADED def add_to_hass(self, hass): """Add the mock config entry to Home Assistant.""" # Mock the async_add method to accept the entry hass.config_entries.async_add.return_value = None hass.config_entries.async_add(self) # Mock async_setup to be an AsyncMock that returns True hass.config_entries.async_setup = AsyncMock(return_value=True) # Mock the create_task to immediately run the coroutine # Define a coroutine that runs async_setup and updates the entry state async def run_setup(entry_id): result = await hass.config_entries.async_setup(entry_id) if result: self.state = config_entries.ConfigEntryState.LOADED else: self.state = config_entries.ConfigEntryState.SETUP_ERROR # Assign the coroutine as a side effect to create_task hass.loop.create_task = AsyncMock( side_effect=lambda coro: asyncio.create_task(run_setup(self.entry_id)) ) ================================================ FILE: 
custom_components/uk_bin_collection/tests/test_calendar.py ================================================ # test_calendar.py """Unit tests for the UK Bin Collection Calendar platform.""" import pytest from unittest.mock import MagicMock, AsyncMock, patch from datetime import datetime, date, timedelta from homeassistant.core import HomeAssistant from homeassistant.helpers.update_coordinator import DataUpdateCoordinator from custom_components.uk_bin_collection.const import DOMAIN from custom_components.uk_bin_collection.calendar import ( UKBinCollectionCalendar, async_setup_entry, async_unload_entry, ) from homeassistant.components.calendar import CalendarEvent from .common_utils import MockConfigEntry pytest_plugins = ["freezegun"] # Mock Data MOCK_COORDINATOR_DATA = { "Recycling": date(2024, 4, 25), "General Waste": date(2024, 4, 26), "Garden Waste": date(2024, 4, 27), } @pytest.fixture def mock_coordinator(): """Fixture to create a mock DataUpdateCoordinator with sample data.""" coordinator = MagicMock(spec=DataUpdateCoordinator) coordinator.data = MOCK_COORDINATOR_DATA.copy() coordinator.name = "Test Council" coordinator.last_update_success = True return coordinator @pytest.fixture def mock_config_entry(): """Create a mock ConfigEntry.""" return MockConfigEntry( domain=DOMAIN, title="Test Entry", data={ "name": "Test Name", "council": "Test Council", "url": "https://example.com", "timeout": 60, "icon_color_mapping": {}, }, entry_id="test_entry_id", unique_id="test_unique_id", ) @pytest.fixture def hass_instance() -> HomeAssistant: """Return a fake HomeAssistant instance with a data attribute.""" hass = MagicMock(spec=HomeAssistant) # Ensure hass.data is a dict and contains a dict for our DOMAIN hass.data = {DOMAIN: {}} return hass # Tests def test_calendar_entity_initialization(hass_instance, mock_coordinator): """Test that the calendar entity initializes correctly.""" calendar = UKBinCollectionCalendar( coordinator=mock_coordinator, bin_type="Recycling", 
        unique_id="test_entry_id_Recycling_calendar",
        name="Test Council Recycling Calendar",
    )
    assert calendar.name == "Test Council Recycling Calendar"
    assert calendar.unique_id == "test_entry_id_Recycling_calendar"
    assert calendar.device_info == {
        "identifiers": {(DOMAIN, "test_entry_id_Recycling_calendar")},
        "name": "Test Council Recycling Calendar Device",
        "manufacturer": "UK Bin Collection",
        "model": "Bin Collection Calendar",
        "sw_version": "1.0",
    }


def test_calendar_event_property(hass_instance, mock_coordinator):
    """Test that the event property returns the correct CalendarEvent."""
    collection_date = date(2024, 4, 25)
    mock_coordinator.data["Recycling"] = collection_date
    calendar = UKBinCollectionCalendar(
        coordinator=mock_coordinator,
        bin_type="Recycling",
        unique_id="test_entry_id_Recycling_calendar",
        name="Test Council Recycling Calendar",
    )
    expected_event = CalendarEvent(
        summary="Recycling Collection",
        start=collection_date,
        end=collection_date + timedelta(days=1),
        uid="test_entry_id_Recycling_calendar_2024-04-25",
    )
    assert calendar.event == expected_event


def test_calendar_event_property_no_data(hass_instance, mock_coordinator):
    """Test that the event property returns None when there's no collection date."""
    mock_coordinator.data["Recycling"] = None
    calendar = UKBinCollectionCalendar(
        coordinator=mock_coordinator,
        bin_type="Recycling",
        unique_id="test_entry_id_Recycling_calendar",
        name="Test Council Recycling Calendar",
    )
    assert calendar.event is None


@pytest.mark.asyncio
async def test_async_get_events(hass_instance, mock_coordinator):
    """Test that async_get_events returns correct events within the date range."""
    mock_coordinator.data = {
        "Recycling": date(2024, 4, 25),
        "General Waste": date(2024, 4, 26),
    }
    calendar = UKBinCollectionCalendar(
        coordinator=mock_coordinator,
        bin_type="Recycling",
        unique_id="test_entry_id_Recycling_calendar",
        name="Test Council Recycling Calendar",
    )
    start_date = datetime(2024, 4, 24)
    end_date = datetime(2024, 4, 26)
    expected_event = CalendarEvent(
        summary="Recycling Collection",
        start=date(2024, 4, 25),
        end=date(2024, 4, 26),
        uid="test_entry_id_Recycling_calendar_2024-04-25",
    )
    events = await calendar.async_get_events(hass_instance, start_date, end_date)
    assert events == [expected_event]


@pytest.mark.asyncio
async def test_async_get_events_no_events_in_range(hass_instance, mock_coordinator):
    """Test that async_get_events returns an empty list when no events are in the range."""
    mock_coordinator.data = {
        "Recycling": date(2024, 4, 25),
    }
    calendar = UKBinCollectionCalendar(
        coordinator=mock_coordinator,
        bin_type="Recycling",
        unique_id="test_entry_id_Recycling_calendar",
        name="Test Council Recycling Calendar",
    )
    start_date = datetime(2024, 4, 26)
    end_date = datetime(2024, 4, 30)
    events = await calendar.async_get_events(hass_instance, start_date, end_date)
    assert events == []


def test_calendar_update_on_coordinator_change(hass_instance, mock_coordinator):
    """Test that the calendar entity updates when the coordinator's data changes."""
    collection_date_initial = date(2024, 4, 25)
    collection_date_updated = date(2024, 4, 26)
    mock_coordinator.data["Recycling"] = collection_date_initial
    calendar = UKBinCollectionCalendar(
        coordinator=mock_coordinator,
        bin_type="Recycling",
        unique_id="test_entry_id_Recycling_calendar",
        name="Test Council Recycling Calendar",
    )

    # Initially, the event should be for April 25
    expected_event_initial = CalendarEvent(
        summary="Recycling Collection",
        start=collection_date_initial,
        end=collection_date_initial + timedelta(days=1),
        uid="test_entry_id_Recycling_calendar_2024-04-25",
    )
    assert calendar.event == expected_event_initial

    # Update the coordinator's data
    mock_coordinator.data["Recycling"] = collection_date_updated
    mock_coordinator.async_write_ha_state = AsyncMock()

    # Simulate a coordinator update by calling the update handler
    with patch.object(calendar, "async_write_ha_state", new=AsyncMock()) as mock_write:
        calendar._handle_coordinator_update()

        # The event should now be updated to April 26
        expected_event_updated = CalendarEvent(
            summary="Recycling Collection",
            start=collection_date_updated,
            end=collection_date_updated + timedelta(days=1),
            uid="test_entry_id_Recycling_calendar_2024-04-26",
        )
        assert calendar.event == expected_event_updated
        mock_write.assert_called_once()


@pytest.mark.asyncio
async def test_async_setup_entry_creates_calendar_entities(
    hass_instance, mock_coordinator, mock_config_entry
):
    """Test that async_setup_entry creates calendar entities based on coordinator data."""
    # Mock the data in the coordinator
    mock_coordinator.data = {
        "Recycling": date(2024, 4, 25),
        "General Waste": date(2024, 4, 26),
    }

    # Patch the hass.data to include the coordinator
    hass_instance.data[DOMAIN][mock_config_entry.entry_id] = {
        "coordinator": mock_coordinator,
    }

    with patch(
        "custom_components.uk_bin_collection.calendar.UKBinCollectionCalendar",
        autospec=True,
    ) as mock_calendar_cls:
        mock_calendar_instance_recycling = MagicMock()
        mock_calendar_instance_general_waste = MagicMock()
        mock_calendar_cls.side_effect = [
            mock_calendar_instance_recycling,
            mock_calendar_instance_general_waste,
        ]

        await async_setup_entry(hass_instance, mock_config_entry, lambda entities: None)

        # Ensure that two calendar entities are created
        assert mock_calendar_cls.call_count == 2

        # Verify that the calendar entities are initialized with correct parameters
        mock_calendar_cls.assert_any_call(
            coordinator=mock_coordinator,
            bin_type="Recycling",
            unique_id="test_entry_id_Recycling_calendar",
            name="Test Council Recycling Calendar",
        )
        mock_calendar_cls.assert_any_call(
            coordinator=mock_coordinator,
            bin_type="General Waste",
            unique_id="test_entry_id_General Waste_calendar",
            name="Test Council General Waste Calendar",
        )


@pytest.mark.asyncio
async def test_async_setup_entry_handles_empty_data(hass_instance, mock_config_entry):
    """Test that async_setup_entry handles empty coordinator data gracefully."""
    # Mock a coordinator with empty data
    mock_coordinator = MagicMock(spec=DataUpdateCoordinator)
    mock_coordinator.data = {}
    mock_coordinator.name = "Test Council"
    mock_coordinator.last_update_success = True

    # Patch the hass.data to include the coordinator
    hass_instance.data[DOMAIN][mock_config_entry.entry_id] = {
        "coordinator": mock_coordinator,
    }

    with patch(
        "custom_components.uk_bin_collection.calendar.UKBinCollectionCalendar",
        autospec=True,
    ) as mock_calendar_cls:
        await async_setup_entry(hass_instance, mock_config_entry, lambda entities: None)

        # No calendar entities should be created since there's no data
        mock_calendar_cls.assert_not_called()


@pytest.mark.asyncio
async def test_async_setup_entry_handles_coordinator_failure(
    hass_instance, mock_config_entry
):
    """Test that async_setup_entry raises ConfigEntryNotReady on coordinator failure."""
    mock_coordinator = MagicMock(spec=DataUpdateCoordinator)
    mock_coordinator.async_config_entry_first_refresh.side_effect = Exception(
        "Update failed"
    )
    mock_coordinator.name = "Test Council"

    # Patch the hass.data to include the coordinator
    hass_instance.data[DOMAIN][mock_config_entry.entry_id] = {
        "coordinator": mock_coordinator,
    }

    with pytest.raises(Exception, match="Update failed"):
        await async_setup_entry(hass_instance, mock_config_entry, lambda entities: None)


@pytest.mark.asyncio
async def test_async_unload_entry(hass_instance, mock_coordinator, mock_config_entry):
    """Test that async_unload_entry unloads calendar entities correctly."""
    # Prepare the coordinator data
    mock_coordinator.data = {"Recycling": date(2024, 4, 25)}
    mock_coordinator.name = "Test Council"
    hass_instance.data[DOMAIN][mock_config_entry.entry_id] = {
        "coordinator": mock_coordinator
    }

    result = await async_unload_entry(hass_instance, mock_config_entry, None)
    assert result is True


def test_calendar_entity_available_property(hass_instance, mock_coordinator):
    """Test the available property of the calendar entity."""
    # When data is present and last_update_success is True
    mock_coordinator.last_update_success = True
    mock_coordinator.data["Recycling"] = date(2024, 4, 25)
    calendar = UKBinCollectionCalendar(
        coordinator=mock_coordinator,
        bin_type="Recycling",
        unique_id="test_entry_id_Recycling_calendar",
        name="Test Council Recycling Calendar",
    )
    assert calendar.available is True

    # When data is missing
    mock_coordinator.data["Recycling"] = None
    assert calendar.available is False

    # When last_update_success is False
    mock_coordinator.last_update_success = False
    calendar._state = "Unknown"  # Assuming state is set to "Unknown" when unavailable
    assert calendar.available is False


@pytest.mark.asyncio
async def test_async_setup_entry_creates_no_calendar_entities_on_empty_data(
    hass_instance, mock_config_entry
):
    """Test that async_setup_entry does not create calendar entities when coordinator data is empty."""
    mock_coordinator = MagicMock(spec=DataUpdateCoordinator)
    mock_coordinator.data = {}
    mock_coordinator.name = "Test Council"
    mock_coordinator.last_update_success = True

    # Patch the hass.data to include the coordinator
    hass_instance.data[DOMAIN][mock_config_entry.entry_id] = {
        "coordinator": mock_coordinator,
    }

    with patch(
        "custom_components.uk_bin_collection.calendar.UKBinCollectionCalendar",
        autospec=True,
    ) as mock_calendar_cls:
        await async_setup_entry(hass_instance, mock_config_entry, lambda entities: None)

        # No calendar entities should be created
        mock_calendar_cls.assert_not_called()


@pytest.mark.asyncio
async def test_async_setup_entry_with_coordinator_failure(
    hass_instance, mock_config_entry
):
    """Test that async_setup_entry handles coordinator failures gracefully."""
    mock_coordinator = MagicMock(spec=DataUpdateCoordinator)
    mock_coordinator.async_config_entry_first_refresh.side_effect = Exception(
        "Update failed"
    )
    mock_coordinator.name = "Test Council"

    # Patch the hass.data to include the coordinator
    hass_instance.data[DOMAIN][mock_config_entry.entry_id] = {
        "coordinator": mock_coordinator,
    }

    with pytest.raises(Exception, match="Update failed"):
        await async_setup_entry(hass_instance, mock_config_entry, lambda entities: None)


@pytest.mark.asyncio
async def test_async_setup_entry_coordinator_refresh_failure(
    hass_instance, mock_config_entry
):
    """Test that async_setup_entry raises an exception when coordinator refresh fails."""
    mock_coordinator = MagicMock(spec=DataUpdateCoordinator)
    # Provide an empty data dictionary so that accessing .data does not fail immediately.
    mock_coordinator.data = {}
    mock_coordinator.name = "Test Council"
    # Make the refresh raise an exception.
    mock_coordinator.async_config_entry_first_refresh = AsyncMock(
        side_effect=Exception("Update failed")
    )
    hass_instance.data[DOMAIN][mock_config_entry.entry_id] = {
        "coordinator": mock_coordinator
    }
    with pytest.raises(Exception, match="Update failed"):
        await async_setup_entry(hass_instance, mock_config_entry, lambda entities: None)


@pytest.mark.asyncio
async def test_async_get_events_multiple_events_same_day(
    hass_instance, mock_coordinator
):
    """Test async_get_events when multiple bin types have the same collection date."""
    mock_coordinator.data = {
        "Recycling": date(2024, 4, 25),
        "General Waste": date(2024, 4, 25),
    }
    calendar_recycling = UKBinCollectionCalendar(
        coordinator=mock_coordinator,
        bin_type="Recycling",
        unique_id="test_entry_id_Recycling_calendar",
        name="Test Council Recycling Calendar",
    )
    calendar_general_waste = UKBinCollectionCalendar(
        coordinator=mock_coordinator,
        bin_type="General Waste",
        unique_id="test_entry_id_General Waste_calendar",
        name="Test Council General Waste Calendar",
    )
    start_date = datetime(2024, 4, 24)
    end_date = datetime(2024, 4, 26)
    expected_event_recycling = CalendarEvent(
        summary="Recycling Collection",
        start=date(2024, 4, 25),
        end=date(2024, 4, 26),
        uid="test_entry_id_Recycling_calendar_2024-04-25",
    )
    expected_event_general_waste = CalendarEvent(
        summary="General Waste Collection",
        start=date(2024, 4, 25),
        end=date(2024, 4, 26),
        uid="test_entry_id_General Waste_calendar_2024-04-25",
    )
    events_recycling = await calendar_recycling.async_get_events(
        hass_instance, start_date, end_date
    )
    events_general_waste = await calendar_general_waste.async_get_events(
        hass_instance, start_date, end_date
    )
    assert events_recycling == [expected_event_recycling]
    assert events_general_waste == [expected_event_general_waste]


@pytest.mark.asyncio
async def test_async_get_events_no_coordinator_data(hass_instance, mock_coordinator):
    """Test async_get_events when the coordinator has no data."""
    mock_coordinator.data = {}
    calendar = UKBinCollectionCalendar(
        coordinator=mock_coordinator,
        bin_type="Recycling",
        unique_id="test_entry_id_Recycling_calendar",
        name="Test Council Recycling Calendar",
    )
    start_date = datetime(2024, 4, 24)
    end_date = datetime(2024, 4, 26)
    events = await calendar.async_get_events(hass_instance, start_date, end_date)
    assert events == []


def test_calendar_entity_available_property_no_data(hass_instance, mock_coordinator):
    """Test that the calendar's available property is False when there's no data."""
    mock_coordinator.data["Recycling"] = None
    calendar = UKBinCollectionCalendar(
        coordinator=mock_coordinator,
        bin_type="Recycling",
        unique_id="test_entry_id_Recycling_calendar",
        name="Test Council Recycling Calendar",
    )
    assert calendar.available is False


@pytest.mark.asyncio
async def test_calendar_entity_extra_state_attributes(hass_instance, mock_coordinator):
    """Test the extra_state_attributes property of the calendar entity."""
    mock_coordinator.data["Recycling"] = date(2024, 4, 25)
    calendar = UKBinCollectionCalendar(
        coordinator=mock_coordinator,
        bin_type="Recycling",
        unique_id="test_entry_id_Recycling_calendar",
        name="Test Council Recycling Calendar",
    )
    # extra_state_attributes defaults to None unless calendar.py implements it
    # (e.g. to expose 'next_collection_date' or 'days_until_collection');
    # the current implementation is expected to return an empty dict.
    assert calendar.extra_state_attributes == {}


@pytest.mark.asyncio
async def test_async_setup_entry_handles_coordinator_partial_data(
    hass_instance, mock_config_entry
):
    """Test that async_setup_entry creates calendar entities only for available data."""
    mock_coordinator = MagicMock(spec=DataUpdateCoordinator)
    mock_coordinator.data = {
        "Recycling": date(2024, 4, 25),
        "General Waste": None,  # No collection date
        "Garden Waste": date(2024, 4, 27),
    }
    mock_coordinator.name = "Test Council"
    mock_coordinator.async_config_entry_first_refresh = AsyncMock(return_value=None)
    hass_instance.data[DOMAIN][mock_config_entry.entry_id] = {
        "coordinator": mock_coordinator
    }

    with patch(
        "custom_components.uk_bin_collection.calendar.UKBinCollectionCalendar",
        autospec=True,
    ) as mock_calendar_cls:
        mock_calendar_instance_recycling = MagicMock()
        mock_calendar_instance_garden_waste = MagicMock()
        mock_calendar_cls.side_effect = [
            mock_calendar_instance_recycling,
            mock_calendar_instance_garden_waste,
        ]

        await async_setup_entry(hass_instance, mock_config_entry, lambda entities: None)

        # Ensure that two calendar entities are created (skipping "General Waste")
        assert mock_calendar_cls.call_count == 2

        # Verify that the calendar entities are initialized with correct parameters
        mock_calendar_cls.assert_any_call(
            coordinator=mock_coordinator,
            bin_type="Recycling",
            unique_id=f"{mock_config_entry.entry_id}_Recycling_calendar",
            name="Test Council Recycling Calendar",
        )
        mock_calendar_cls.assert_any_call(
            coordinator=mock_coordinator,
            bin_type="Garden Waste",
            unique_id=f"{mock_config_entry.entry_id}_Garden Waste_calendar",
            name="Test Council Garden Waste Calendar",
        )


================================================
FILE: custom_components/uk_bin_collection/tests/test_config_flow.py
================================================
# test_config_flow.py
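The windowing behaviour asserted throughout test_calendar.py above reduces to a small amount of date arithmetic: a collection date becomes an all-day event spanning `[date, date + 1 day)` with a uid of `{unique_id}_{ISO date}`, returned only when it falls inside the half-open query window. A dependency-free sketch of that logic (the function name and dict shape are illustrative, not the component's actual API):

```python
from datetime import date, datetime, timedelta


def events_in_range(unique_id, bin_type, collection_date, start, end):
    """Return the single all-day event if it falls inside [start, end)."""
    if collection_date is None:
        return []  # no data -> no event (mirrors the no-data tests)
    if not (start.date() <= collection_date < end.date()):
        return []  # outside the query window
    return [
        {
            "summary": f"{bin_type} Collection",
            "start": collection_date,
            "end": collection_date + timedelta(days=1),
            "uid": f"{unique_id}_{collection_date.isoformat()}",
        }
    ]
```

This reproduces the fixtures' expectations: April 25 is returned for a 24th-to-26th query, and excluded when the window starts on the 26th.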
"""Test UkBinCollection config flow."""

import asyncio
import json
import logging
from datetime import date, datetime, timedelta
from json import JSONDecodeError
from unittest.mock import AsyncMock, MagicMock, patch

import aiohttp
import pytest
import voluptuous as vol
from homeassistant import config_entries, data_entry_flow
from homeassistant.const import CONF_NAME, CONF_URL
from homeassistant.core import HomeAssistant
from homeassistant.exceptions import ConfigEntryNotReady

from custom_components.uk_bin_collection.config_flow import (
    UkBinCollectionConfigFlow,
    UkBinCollectionOptionsFlowHandler,
    async_get_options_flow,
)
from custom_components.uk_bin_collection.const import DOMAIN, LOG_PREFIX
from custom_components.uk_bin_collection.sensor import load_icon_color_mapping

from .common_utils import MockConfigEntry

_LOGGER = logging.getLogger(__name__)


@pytest.fixture
def hass_with_loop(hass, event_loop):
    hass.loop = event_loop
    return hass


# Mock council data representing different scenarios
MOCK_COUNCILS_DATA = {
    "CouncilTest": {
        "wiki_name": "Council Test",
        "uprn": True,
        "url": "https://example.com/council_test",
        "skip_get_url": False,
    },
    "CouncilSkip": {
        "wiki_name": "Council Skip URL",
        "skip_get_url": True,
        "url": "https://example.com/skip",
    },
    "CouncilWithoutURL": {
        "wiki_name": "Council without URL",
        "skip_get_url": True,
        # Do not include 'custom_component_show_url_field'
        # Other necessary fields
        "uprn": True,
        "url": "https://example.com/council_without_url",
    },
    "CouncilWithUSRN": {
        "wiki_name": "Council with USRN",
        "usrn": True,
    },
    "CouncilWithUPRN": {
        "wiki_name": "Council with UPRN",
        "uprn": True,
    },
    "CouncilWithPostcodeNumber": {
        "wiki_name": "Council with Postcode and Number",
        "postcode": True,
        "house_number": True,
    },
    "CouncilWithWebDriver": {
        "wiki_name": "Council with Web Driver",
        "web_driver": True,
    },
    "CouncilSkippingURL": {
        "wiki_name": "Council skipping URL",
        "skip_get_url": True,
        "url": "https://council.example.com",
    },
    "CouncilCustomURLField": {
        "wiki_name": "Council with Custom URL Field",
        "custom_component_show_url_field": True,
    },
    # Add more mock councils as needed to cover different scenarios
}


# Create a dummy HomeAssistant object.
class DummyHass:
    def __init__(self, loop):
        self.data = {}
        self.config_entries = MagicMock()
        self.config_entries.async_update_entry = AsyncMock()
        self.config_entries.async_reload = AsyncMock()
        self.loop = loop


@pytest.fixture
def dummy_hass(event_loop):
    return DummyHass(event_loop)


# Sample councils data for the options flow tests.
MOCK_COUNCILS_DATA_OPTIONS = {
    "CouncilTest": {
        "wiki_name": "Council Test",
        "uprn": True,
        "url": "https://example.com/council_test",
    }
}


@pytest.fixture
def options_flow(dummy_hass):
    """Create an instance of the options flow with a dummy config entry."""
    config_entry = MockConfigEntry(
        domain=DOMAIN,
        data={
            "name": "Test Options",
            "council": "CouncilTest",
            "update_interval": 12,
            "icon_color_mapping": '{"CouncilTest": {"icon": "mdi:trash", "color": "green"}}',
        },
        entry_id="options_test",
        unique_id="options_unique",
    )
    config_entry.add_to_hass(dummy_hass)
    flow = UkBinCollectionOptionsFlowHandler(config_entry)
    flow.hass = dummy_hass
    return flow, config_entry


# Dummy config entry class for testing.
class DummyEntry:
    def __init__(self, data, entry_id="dummy"):
        self.data = data
        self.entry_id = entry_id
        self.title = data.get("name", "")


# Helper function to initiate the config flow and proceed through its steps
async def proceed_through_config_flow(
    hass: HomeAssistant, flow, user_input_initial, user_input_council
):
    # Start the flow and complete the `user` step
    result = await flow.async_step_user(user_input=user_input_initial)
    assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
    assert result["step_id"] == "council"

    # Complete the `council` step
    result = await flow.async_step_council(user_input=user_input_council)
    return result


@pytest.mark.asyncio
async def test_config_flow_with_uprn(hass: HomeAssistant):
    """Test config flow for a council requiring UPRN."""
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=MOCK_COUNCILS_DATA,
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass
        user_input_initial = {
            "name": "Test Name",
            "council": "Council with UPRN",
        }
        user_input_council = {
            "uprn": "1234567890",
            "timeout": 60,
        }
        result = await proceed_through_config_flow(
            hass, flow, user_input_initial, user_input_council
        )
        assert result["type"] == data_entry_flow.FlowResultType.CREATE_ENTRY
        assert result["title"] == "Test Name"
        assert result["data"] == {
            "name": "Test Name",
            "council": "CouncilWithUPRN",
            "uprn": "1234567890",
            "timeout": 60,
        }


async def test_config_flow_with_postcode_and_number(hass: HomeAssistant):
    """Test config flow for a council requiring postcode and house number."""
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=MOCK_COUNCILS_DATA,
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass
        user_input_initial = {
            "name": "Test Name",
            "council": "Council with Postcode and Number",
        }
        user_input_council = {
            "postcode": "AB1 2CD",
            "number": "42",
            "timeout": 60,
        }
        result = await proceed_through_config_flow(
            hass, flow, user_input_initial, user_input_council
        )
        assert result["type"] == data_entry_flow.FlowResultType.CREATE_ENTRY
        assert result["title"] == "Test Name"
        assert result["data"] == {
            "name": "Test Name",
            "council": "CouncilWithPostcodeNumber",
            "postcode": "AB1 2CD",
            "number": "42",
            "timeout": 60,
        }


async def test_config_flow_with_web_driver(hass: HomeAssistant):
    """Test config flow for a council requiring a web driver."""
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=MOCK_COUNCILS_DATA,
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass
        user_input_initial = {
            "name": "Test Name",
            "council": "Council with Web Driver",
        }
        user_input_council = {
            "web_driver": "/path/to/webdriver",
            "headless": True,
            "local_browser": False,
            "timeout": 60,
        }
        result = await proceed_through_config_flow(
            hass, flow, user_input_initial, user_input_council
        )
        assert result["type"] == data_entry_flow.FlowResultType.CREATE_ENTRY
        assert result["title"] == "Test Name"
        assert result["data"] == {
            "name": "Test Name",
            "council": "CouncilWithWebDriver",
            "web_driver": "/path/to/webdriver",
            "headless": True,
            "local_browser": False,
            "timeout": 60,
        }


async def test_config_flow_skipping_url(hass: HomeAssistant):
    """Test config flow for a council that skips URL input."""
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=MOCK_COUNCILS_DATA,
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass
        user_input_initial = {
            "name": "Test Name",
            "council": "Council skipping URL",
        }
        user_input_council = {
            "timeout": 60,
        }
        result = await proceed_through_config_flow(
            hass, flow, user_input_initial, user_input_council
        )
        assert result["type"] == data_entry_flow.FlowResultType.CREATE_ENTRY
        assert result["title"] == "Test Name"
        assert result["data"] == {
            "name": "Test Name",
            "council": "CouncilSkippingURL",
            "skip_get_url": True,
            "url": "https://council.example.com",
            "timeout": 60,
        }
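The flow tests above all follow the same two-step shape that `proceed_through_config_flow` drives: a `user` step that resolves a human-readable wiki name to an internal council key, then a `council` step that merges that council's specific fields into the entry data. A tiny stand-in state machine (illustrative only, not Home Assistant's `FlowHandler` API) makes the pattern explicit:

```python
class TwoStepFlow:
    """Minimal stand-in for a two-step config flow (illustrative only)."""

    def __init__(self, councils):
        self.councils = councils  # council key -> metadata including wiki_name
        self.data = {}

    def step_user(self, user_input):
        # Resolve the human-readable wiki name to the internal council key.
        for key, meta in self.councils.items():
            if meta["wiki_name"] == user_input["council"]:
                self.data = {"name": user_input["name"], "council": key}
                return {"type": "form", "step_id": "council"}
        return {"type": "form", "step_id": "user", "errors": {"council": "invalid"}}

    def step_council(self, user_input):
        # Merge the council-specific fields and finish the flow.
        self.data.update(user_input)
        return {"type": "create_entry", "title": self.data["name"], "data": self.data}
```

This mirrors, for example, `test_config_flow_with_uprn`: the wiki name "Council with UPRN" resolves to the "CouncilWithUPRN" key, after which the `uprn` and `timeout` fields are merged into the created entry's data.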
async def test_config_flow_with_custom_url_field(hass: HomeAssistant): """Test config flow for a council with custom URL field.""" with patch( "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json", return_value=MOCK_COUNCILS_DATA, ): flow = UkBinCollectionConfigFlow() flow.hass = hass user_input_initial = { "name": "Test Name", "council": "Council with Custom URL Field", } user_input_council = { "url": "https://custom-url.example.com", "timeout": 60, } result = await proceed_through_config_flow( hass, flow, user_input_initial, user_input_council ) assert result["type"] == data_entry_flow.FlowResultType.CREATE_ENTRY assert result["title"] == "Test Name" assert result["data"] == { "name": "Test Name", "council": "CouncilCustomURLField", "url": "https://custom-url.example.com", "timeout": 60, } async def test_config_flow_missing_name(hass: HomeAssistant): """Test config flow when name is missing.""" with patch( "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json", return_value=MOCK_COUNCILS_DATA, ): flow = UkBinCollectionConfigFlow() flow.hass = hass user_input_initial = { "name": "", # Missing name "council": "Council with UPRN", } result = await flow.async_step_user(user_input=user_input_initial) assert result["type"] == data_entry_flow.RESULT_TYPE_FORM assert result["step_id"] == "user" assert result["errors"] == {"name": "Name is required."} async def test_config_flow_invalid_icon_color_mapping(hass: HomeAssistant): """Test config flow with invalid icon_color_mapping JSON.""" with patch( "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json", return_value=MOCK_COUNCILS_DATA, ): flow = UkBinCollectionConfigFlow() flow.hass = hass user_input_initial = { "name": "Test Name", "council": "Council with UPRN", "icon_color_mapping": "invalid json", # Invalid JSON } result = await flow.async_step_user(user_input=user_input_initial) # Should return to the 
user step with an error assert result["type"] == data_entry_flow.RESULT_TYPE_FORM assert result["step_id"] == "user" assert result["errors"] == {"icon_color_mapping": "Invalid JSON format."} async def test_config_flow_with_usrn(hass: HomeAssistant): """Test config flow for a council requiring USRN.""" with patch( "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json", return_value=MOCK_COUNCILS_DATA, ): flow = UkBinCollectionConfigFlow() flow.hass = hass user_input_initial = { "name": "Test Name", "council": "Council with USRN", } user_input_council = { "usrn": "9876543210", "timeout": 60, } result = await proceed_through_config_flow( hass, flow, user_input_initial, user_input_council ) assert result["type"] == data_entry_flow.FlowResultType.CREATE_ENTRY assert result["title"] == "Test Name" assert result["data"] == { "name": "Test Name", "council": "CouncilWithUSRN", "usrn": "9876543210", "timeout": 60, } @pytest.mark.asyncio async def test_reconfigure_flow(hass): """Test reconfiguration of an existing integration.""" with patch( "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json", return_value=MOCK_COUNCILS_DATA, ): # Create an existing entry existing_entry = MockConfigEntry( domain=DOMAIN, data={ "name": "Existing Entry", "council": "CouncilWithUPRN", "uprn": "1234567890", "timeout": 60, }, ) existing_entry.add_to_hass(hass) # Configure async_get_entry to return the existing_entry when called with its entry_id hass.config_entries.async_get_entry.return_value = existing_entry # Configure async_init to return a FlowResultType.FORM with step_id 'reconfigure_confirm' hass.config_entries.flow.async_init.return_value = { "flow_id": "test_flow_id", "type": data_entry_flow.RESULT_TYPE_FORM, "step_id": "reconfigure_confirm", } # Initialize the flow flow = UkBinCollectionConfigFlow() flow.hass = hass # Set the context to reconfigure the existing entry flow.context = {"source": "reconfigure", 
"entry_id": existing_entry.entry_id} # Mock async_step_reconfigure_confirm's behavior with patch.object( flow, "async_step_reconfigure_confirm", new=AsyncMock() ) as mock_step: mock_step.return_value = { "type": data_entry_flow.RESULT_TYPE_CREATE_ENTRY, "title": "Test Name", "data": { "name": "Test Name", "council": "CouncilWithUPRN", "uprn": "0987654321", "timeout": 120, }, } # Start the reconfiguration flow result = await flow.async_step_reconfigure() assert result["type"] == data_entry_flow.FlowResultType.CREATE_ENTRY assert result["title"] == "Test Name" assert result["data"] == { "name": "Test Name", "council": "CouncilWithUPRN", "uprn": "0987654321", "timeout": 120, } # Verify that async_step_reconfigure_confirm was called mock_step.assert_called_once() async def get_councils_json(self) -> object: """Returns an object of supported councils and their required fields.""" url = "https://raw.githubusercontent.com/robbrad/UKBinCollectionData/0.104.0/uk_bin_collection/tests/input.json" try: async with aiohttp.ClientSession() as session: async with session.get(url) as response: data_text = await response.text() return json.loads(data_text) except Exception as e: _LOGGER.error("Failed to fetch councils data: %s", e) return {} @pytest.mark.asyncio async def test_get_councils_json_failure(hass: HomeAssistant): """Test handling when get_councils_json fails.""" with patch( "aiohttp.ClientSession", autospec=True, ) as mock_session_cls: # Configure the mock session to simulate a network error mock_session = mock_session_cls.return_value.__aenter__.return_value mock_session.get.side_effect = Exception("Network error") # Configure async_init to simulate flow abort due to council data being unavailable hass.config_entries.flow.async_init.return_value = { "type": data_entry_flow.RESULT_TYPE_ABORT, "reason": "council_data_unavailable", } # Initialize the flow flow = UkBinCollectionConfigFlow() flow.hass = hass # Start the flow using hass.config_entries.flow.async_init result = 
await hass.config_entries.flow.async_init( DOMAIN, context={"source": config_entries.SOURCE_USER} ) # The flow should abort due to council data being unavailable assert result["type"] == data_entry_flow.FlowResultType.ABORT assert result["reason"] == "council_data_unavailable" async def test_config_flow_user_input_none(hass: HomeAssistant): """Test config flow when user_input is None.""" with patch( "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json", return_value=MOCK_COUNCILS_DATA, ): flow = UkBinCollectionConfigFlow() flow.hass = hass result = await flow.async_step_user(user_input=None) assert result["type"] == data_entry_flow.RESULT_TYPE_FORM assert result["step_id"] == "user" async def test_config_flow_with_optional_fields(hass: HomeAssistant): """Test config flow with optional fields provided.""" # Assume 'CouncilWithOptionalFields' requires 'uprn' and has optional 'web_driver' MOCK_COUNCILS_DATA["CouncilWithOptionalFields"] = { "wiki_name": "Council with Optional Fields", "uprn": True, "web_driver": True, } with patch( "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json", return_value=MOCK_COUNCILS_DATA, ): flow = UkBinCollectionConfigFlow() flow.hass = hass user_input_initial = { "name": "Test Name", "council": "Council with Optional Fields", } user_input_council = { "uprn": "1234567890", "web_driver": "/path/to/webdriver", "headless": True, "local_browser": False, "timeout": 60, } result = await proceed_through_config_flow( hass, flow, user_input_initial, user_input_council ) assert result["type"] == data_entry_flow.FlowResultType.CREATE_ENTRY assert result["title"] == "Test Name" assert result["data"] == { "name": "Test Name", "council": "CouncilWithOptionalFields", "uprn": "1234567890", "web_driver": "/path/to/webdriver", "headless": True, "local_browser": False, "timeout": 60, } @pytest.mark.asyncio async def test_get_councils_json_session_creation_failure(hass): """Test 
handling when creating aiohttp ClientSession fails."""
    with patch(
        "aiohttp.ClientSession",
        side_effect=Exception("Failed to create session"),
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass

        # Configure async_init to simulate flow abort due to council data being unavailable
        hass.config_entries.flow.async_init.return_value = {
            "type": data_entry_flow.RESULT_TYPE_ABORT,
            "reason": "council_data_unavailable",
        }

        # Start the flow using hass.config_entries.flow.async_init
        result = await hass.config_entries.flow.async_init(
            DOMAIN, context={"source": config_entries.SOURCE_USER}
        )

        # The flow should abort due to council data being unavailable
        assert result["type"] == data_entry_flow.FlowResultType.ABORT
        assert result["reason"] == "council_data_unavailable"


@pytest.mark.asyncio
async def test_config_flow_council_without_url(hass):
    """Test config flow for a council where 'url' field should not be included."""
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=MOCK_COUNCILS_DATA,
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass

        user_input_initial = {
            "name": "Test Name",
            "council": "Council without URL",
        }
        user_input_council = {
            "uprn": "1234567890",
            "timeout": 60,
        }

        # Configure async_init to return a FlowResultType.FORM with step_id 'council'
        hass.config_entries.flow.async_init.return_value = {
            "flow_id": "test_flow_id",
            "type": data_entry_flow.RESULT_TYPE_FORM,
            "step_id": "council",
        }

        # Configure async_configure to return a FlowResultType.CREATE_ENTRY
        hass.config_entries.flow.async_configure.return_value = {
            "type": data_entry_flow.RESULT_TYPE_CREATE_ENTRY,
            "title": "Test Name",
            "data": {
                "name": "Test Name",
                "council": "CouncilWithoutURL",
                "uprn": "1234567890",
                "timeout": 60,
                "skip_get_url": True,
                "url": "https://example.com/council_without_url",
            },
        }

        # Start the flow
        result = await hass.config_entries.flow.async_init(
            DOMAIN, context={"source": config_entries.SOURCE_USER}
        )

        # Provide initial user input
        result = await hass.config_entries.flow.async_configure(
            result["flow_id"], user_input=user_input_initial
        )

        assert result["type"] == data_entry_flow.FlowResultType.CREATE_ENTRY
        assert result["title"] == "Test Name"
        assert result["data"] == {
            "name": "Test Name",
            "council": "CouncilWithoutURL",
            "uprn": "1234567890",
            "timeout": 60,
            "skip_get_url": True,
            "url": "https://example.com/council_without_url",
        }


@pytest.mark.asyncio
async def test_config_flow_missing_council(hass: HomeAssistant):
    """Test config flow when council is missing."""
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=MOCK_COUNCILS_DATA,
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass

        user_input_initial = {
            "name": "Test Name",
            "council": "",  # Missing council
        }

        result = await flow.async_step_user(user_input=user_input_initial)

        # Should return to the user step with an error
        assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
        assert result["step_id"] == "user"
        assert result["errors"] == {"council": "Council is required."}


@pytest.mark.asyncio
async def test_reconfigure_flow_with_errors(hass):
    """Test reconfiguration with invalid input."""
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=MOCK_COUNCILS_DATA,
    ):
        # Create an existing entry
        existing_entry = MockConfigEntry(
            domain=DOMAIN,
            data={
                "name": "Existing Entry",
                "council": "CouncilWithUPRN",
                "uprn": "1234567890",
                "timeout": 60,
            },
        )
        existing_entry.add_to_hass(hass)

        # Configure async_get_entry to return the existing_entry when called with its entry_id
        hass.config_entries.async_get_entry.return_value = existing_entry

        # Configure async_init to return a FlowResultType.FORM with step_id 'reconfigure_confirm'
        hass.config_entries.flow.async_init.return_value = {
            "flow_id": "test_flow_id",
            "type": data_entry_flow.RESULT_TYPE_FORM,
            "step_id": "reconfigure_confirm",
        }

        # Initialize the flow
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass

        # Set the context to reconfigure the existing entry
        flow.context = {"source": "reconfigure", "entry_id": existing_entry.entry_id}

        # Mock async_step_reconfigure_confirm's behavior to handle invalid input
        with patch.object(
            flow, "async_step_reconfigure_confirm", new=AsyncMock()
        ) as mock_step:
            mock_step.return_value = {
                "type": data_entry_flow.RESULT_TYPE_FORM,
                "step_id": "reconfigure_confirm",
                "errors": {"icon_color_mapping": "invalid_json"},
            }

            # Start the reconfiguration flow
            result = await flow.async_step_reconfigure()
            assert result["type"] == data_entry_flow.FlowResultType.FORM
            assert result["step_id"] == "reconfigure_confirm"

            # Provide invalid data (e.g., invalid JSON for icon_color_mapping)
            user_input = {
                "name": "Updated Entry",
                "council": "Council with UPRN",
                "uprn": "0987654321",
                "icon_color_mapping": "invalid json",
                "timeout": 60,
            }

            # Configure async_configure to return an error
            hass.config_entries.flow.async_configure.return_value = {
                "type": data_entry_flow.RESULT_TYPE_FORM,
                "step_id": "reconfigure_confirm",
                "errors": {"icon_color_mapping": "invalid_json"},
            }

            result = await flow.async_step_reconfigure_confirm(user_input=user_input)

            # Should return to the reconfigure_confirm step with an error
            assert result["type"] == data_entry_flow.FlowResultType.FORM
            assert result["step_id"] == "reconfigure_confirm"
            assert result["errors"] == {"icon_color_mapping": "invalid_json"}


@pytest.mark.asyncio
async def test_reconfigure_flow_entry_missing(hass):
    """Test reconfiguration when the config entry is missing."""
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=MOCK_COUNCILS_DATA,
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass

        # Set the context with an invalid entry_id to simulate a missing entry
        flow.context = {"source": "reconfigure", "entry_id": "invalid_entry_id"}

        # Mock async_get_entry to return None using MagicMock, not AsyncMock
        hass.config_entries.async_get_entry = MagicMock(return_value=None)

        # Run the reconfiguration step to check for abort
        result = await flow.async_step_reconfigure()

        # Assert that the flow aborts due to the missing config entry
        assert result["type"] == data_entry_flow.FlowResultType.ABORT
        assert result["reason"] == "Reconfigure Failed"


@pytest.mark.asyncio
async def test_reconfigure_flow_no_user_input(hass):
    """Test reconfiguration when user_input is None."""
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=MOCK_COUNCILS_DATA,
    ):
        # Create a mock entry and ensure add_to_hass is awaited
        existing_entry = MockConfigEntry(
            domain=DOMAIN,
            data={
                "name": "Existing Entry",
                "council": "CouncilWithUPRN",
                "uprn": "1234567890",
                "timeout": 60,
            },
        )
        existing_entry.add_to_hass(hass)

        # Mock async_get_entry to return the entry directly, avoiding coroutine issues
        hass.config_entries.async_get_entry = AsyncMock(return_value=existing_entry)

        # Mock async_init and start the reconfigure flow
        hass.config_entries.flow.async_init.return_value = {
            "flow_id": "test_flow_id",
            "type": data_entry_flow.RESULT_TYPE_FORM,
            "step_id": "reconfigure_confirm",
        }

        flow = UkBinCollectionConfigFlow()
        flow.hass = hass
        flow.context = {"source": "reconfigure", "entry_id": existing_entry.entry_id}

        # Proceed without user input, simulating the form return
        with patch.object(
            flow, "async_step_reconfigure_confirm", new=AsyncMock()
        ) as mock_step:
            mock_step.return_value = {
                "type": data_entry_flow.RESULT_TYPE_FORM,
                "step_id": "reconfigure_confirm",
                "errors": {},
            }

            result = await flow.async_step_reconfigure_confirm(user_input=None)
            assert result["type"] == data_entry_flow.FlowResultType.FORM
            assert result["step_id"] == "reconfigure_confirm"


@pytest.mark.asyncio
async def test_check_selenium_server_exception(hass: HomeAssistant):
    """Test exception handling in check_selenium_server."""
    with patch(
        "aiohttp.ClientSession.get",
        side_effect=Exception("Connection error"),
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass

        result = await flow.check_selenium_server()

        # Expected result is that all URLs are marked as not accessible
        expected_result = [
            ("http://localhost:4444", False),
            ("http://selenium:4444", False),
        ]
        assert result == expected_result


@pytest.mark.asyncio
async def test_get_councils_json_exception(hass: HomeAssistant):
    """Test exception handling in get_councils_json."""
    with patch(
        "aiohttp.ClientSession.get",
        side_effect=Exception("Network error"),
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass

        result = await flow.get_councils_json()
        assert result == {}


@pytest.mark.asyncio
async def test_async_step_user_council_data_unavailable(hass: HomeAssistant):
    """Test async_step_user when council data is unavailable."""
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=None,
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass

        result = await flow.async_step_user(user_input={})
        assert result["type"] == data_entry_flow.FlowResultType.ABORT
        assert result["reason"] == "Council Data Unavailable"


@pytest.mark.asyncio
async def test_async_step_council_invalid_icon_color_mapping(hass: HomeAssistant):
    """Test async_step_council with invalid JSON in icon_color_mapping."""
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=MOCK_COUNCILS_DATA,
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass
        flow.data = {
            "name": "Test Name",
            "council": "CouncilWithUPRN",
        }
        flow.councils_data = MOCK_COUNCILS_DATA

        user_input = {
            "uprn": "1234567890",
            "icon_color_mapping": "invalid json",
            "timeout": 60,
        }

        result = await flow.async_step_council(user_input=user_input)
        assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
        assert result["step_id"] == "council"
        assert result["errors"] == {"icon_color_mapping": "Invalid JSON format."}


@pytest.mark.asyncio
async def test_async_step_reconfigure_entry_none(hass: HomeAssistant):
    """Test async_step_reconfigure when config entry is None."""
    flow = UkBinCollectionConfigFlow()
    flow.hass = hass
    flow.context = {"entry_id": "non_existent_entry_id"}

    # Mock async_get_entry to return None
    flow.hass.config_entries.async_get_entry = MagicMock(return_value=None)

    result = await flow.async_step_reconfigure()
    assert result["type"] == data_entry_flow.FlowResultType.ABORT
    assert result["reason"] == "Reconfigure Failed"


@pytest.mark.asyncio
async def test_async_step_reconfigure_confirm_user_input_none(hass: HomeAssistant):
    flow = UkBinCollectionConfigFlow()
    flow.hass = hass

    # Create a mock config entry
    config_entry = MockConfigEntry(
        domain=DOMAIN,
        data={
            "name": "Test Name",
            "council": "CouncilWithUPRN",
            "uprn": "1234567890",
            "timeout": 60,
        },
    )
    config_entry.add_to_hass(hass)
    flow.config_entry = config_entry
    flow.context = {"entry_id": config_entry.entry_id}
    flow.councils_data = MOCK_COUNCILS_DATA

    # Patch async_get_entry to return the config_entry immediately
    hass.config_entries.async_get_entry = MagicMock(return_value=config_entry)

    result = await flow.async_step_reconfigure_confirm(user_input=None)
    assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
    assert result["step_id"] == "reconfigure_confirm"


@pytest.mark.asyncio
async def test_async_step_council_missing_council_key(hass: HomeAssistant):
    """Test async_step_council when council_key is missing in councils_data."""
    flow = UkBinCollectionConfigFlow()
    flow.hass = hass
    flow.data = {
        "name": "Test Name",
        "council": "NonExistentCouncil",
    }
    flow.councils_data = MOCK_COUNCILS_DATA

    result = await flow.async_step_council(user_input=None)
    assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
    assert result["step_id"] == "council"


@pytest.mark.asyncio
async def test_check_chromium_installed_exception(hass: HomeAssistant):
    """Test exception handling in check_chromium_installed."""
    with patch(
        "shutil.which",
        side_effect=Exception("Filesystem error"),
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass

        result = await flow.check_chromium_installed()
        assert result is False


@pytest.mark.asyncio
async def test_async_step_reconfigure_confirm_invalid_json(hass: HomeAssistant):
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=MOCK_COUNCILS_DATA,
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass

        # Create a mock config entry
        config_entry = MockConfigEntry(
            domain=DOMAIN,
            data={
                "name": "Existing Entry",
                "council": "CouncilWithUPRN",
                "uprn": "1234567890",
                "timeout": 60,
            },
        )
        config_entry.add_to_hass(hass)
        flow.config_entry = config_entry
        flow.context = {"entry_id": config_entry.entry_id}

        # Patch async_get_entry to return the config_entry (not a coroutine)
        hass.config_entries.async_get_entry = MagicMock(return_value=config_entry)

        # Set up mocks for async methods
        hass.config_entries.async_reload = AsyncMock()
        hass.config_entries.async_update_entry = MagicMock()

        user_input = {
            "name": "Updated Entry",
            "council": "Council with UPRN",
            "icon_color_mapping": "invalid json",
            "uprn": "0987654321",
            "timeout": 120,
        }

        result = await flow.async_step_reconfigure_confirm(user_input=user_input)

        # Should return to the reconfigure_confirm step with an error
        assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
        assert result["step_id"] == "reconfigure_confirm"
        assert result["errors"] == {"icon_color_mapping": "Invalid JSON format."}


@pytest.mark.asyncio
async def test_config_flow_with_manual_refresh_only(hass: HomeAssistant):
    """Test config flow when the user selects manual_refresh_only = True."""
    mock_councils = {
        "CouncilWithUPRN": {
            "wiki_name": "Council with UPRN",
            "uprn": True,
        }
    }
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=mock_councils,
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass

        # Step 1: user selects council + sets manual_refresh_only
        user_input_initial = {
            "name": "Test Manual Refresh",
            "council": "Council with UPRN",
            "manual_refresh_only": True,
            # icon_color_mapping, etc. are optional
        }

        # Step 2: council details
        # minimal fields needed for council requiring UPRN
        user_input_council = {
            "uprn": "1234567890",
            "timeout": 45,
            # note that if skip_get_url is False, you might need "url" or not
        }

        # Start user step
        result = await flow.async_step_user(user_input=user_input_initial)
        assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
        assert result["step_id"] == "council"

        # Complete council step
        result = await flow.async_step_council(user_input=user_input_council)
        assert result["type"] == data_entry_flow.RESULT_TYPE_CREATE_ENTRY
        assert result["title"] == "Test Manual Refresh"

        # Confirm the config entry data now includes manual_refresh_only
        assert result["data"] == {
            "name": "Test Manual Refresh",
            "council": "CouncilWithUPRN",
            "uprn": "1234567890",
            "timeout": 45,
            "manual_refresh_only": True,
        }


# ---------------------------
# Tests for helper functions
# ---------------------------
def test_load_icon_color_mapping_valid():
    valid_json = '{"General Waste": {"icon": "mdi:trash-can", "color": "brown"}}'
    result = load_icon_color_mapping(valid_json)
    assert isinstance(result, dict)
    assert result["General Waste"]["icon"] == "mdi:trash-can"
    assert result["General Waste"]["color"] == "brown"


def test_load_icon_color_mapping_invalid():
    invalid_json = '{"icon":"mdi:trash" "no_comma":true}'  # missing comma
    with patch("logging.Logger.warning") as mock_warn:
        result = load_icon_color_mapping(invalid_json)
        assert result == {}
        mock_warn.assert_called_once_with(
            f"{LOG_PREFIX} Invalid icon_color_mapping JSON: {invalid_json}. Using default settings."
        )


def test_map_wiki_name_to_council_key():
    flow = UkBinCollectionConfigFlow()
    flow.council_names = ["CouncilTest"]
    flow.council_options = ["Council Test"]

    # Valid mapping
    key = flow.map_wiki_name_to_council_key("Council Test")
    assert key == "CouncilTest"

    # Invalid mapping: expect empty string and a logged error.
    with patch("logging.Logger.error") as mock_error:
        key_invalid = flow.map_wiki_name_to_council_key("Not Exist")
        assert key_invalid == ""
        mock_error.assert_called_once_with(
            "Wiki name '%s' not found in council options.", "Not Exist"
        )


def test_is_valid_json():
    valid = '{"key": "value"}'
    invalid = '{"key": "value",}'  # trailing comma makes it invalid
    assert UkBinCollectionConfigFlow.is_valid_json(valid) is True
    assert UkBinCollectionConfigFlow.is_valid_json(invalid) is False


# ---------------------------
# Tests for async_step_user
# ---------------------------
@pytest.mark.asyncio
async def test_async_step_user_missing_fields(hass):
    """Test async_step_user returns errors when required fields are missing."""
    flow = UkBinCollectionConfigFlow()
    flow.hass = hass

    # Set councils data so that form is rendered
    flow.councils_data = MOCK_COUNCILS_DATA
    flow.council_names = list(MOCK_COUNCILS_DATA.keys())
    flow.council_options = [
        MOCK_COUNCILS_DATA[name]["wiki_name"] for name in flow.council_names
    ]

    # Missing both 'name' and 'council'
    result = await flow.async_step_user(user_input={"name": "", "council": ""})
    assert result["type"] == "form"
    assert result["step_id"] == "user"
    assert "name" in result["errors"]
    assert "council" in result["errors"]


@pytest.mark.asyncio
async def test_async_step_user_invalid_icon_mapping(hass):
    """Test async_step_user returns error for invalid icon_color_mapping JSON."""
    flow = UkBinCollectionConfigFlow()
    flow.hass = hass
    flow.councils_data = MOCK_COUNCILS_DATA
    flow.council_names = list(MOCK_COUNCILS_DATA.keys())
    flow.council_options = [
        MOCK_COUNCILS_DATA[name]["wiki_name"] for name in flow.council_names
    ]

    result = await flow.async_step_user(
        user_input={
            "name": "Test Name",
            "council": MOCK_COUNCILS_DATA["CouncilTest"]["wiki_name"],
            "icon_color_mapping": "not a json",
        }
    )
    assert result["type"] == "form"
    assert result["errors"] == {"icon_color_mapping": "Invalid JSON format."}


@pytest.mark.asyncio
async def test_async_step_user_no_councils(hass):
    """Test async_step_user aborts when councils data cannot be fetched."""
    flow = UkBinCollectionConfigFlow()
    flow.hass = hass

    # Patch get_councils_json to return an empty dict (simulate failure)
    with patch.object(flow, "get_councils_json", return_value={}):
        result = await flow.async_step_user(
            user_input={"name": "Test", "council": "CouncilTest"}
        )
        assert result["type"] == "abort"
        assert result["reason"] == "Council Data Unavailable"


# ---------------------------
# Tests for async_step_council
# ---------------------------
@pytest.mark.asyncio
async def test_async_step_council_skip_get_url(hass):
    """Test that async_step_council sets skip_get_url when required."""
    flow = UkBinCollectionConfigFlow()
    flow.hass = hass

    # Set up data so that the council in question requires URL skipping.
    flow.data = {"name": "Test", "council": "CouncilSkip"}
    flow.councils_data = MOCK_COUNCILS_DATA

    # Provide minimal user input (e.g. only timeout)
    user_input = {"timeout": 60}
    result = await flow.async_step_council(user_input=user_input)

    # In a real flow, if no errors are present the entry would be created.
    # Here, we simply verify that the user input was merged with skip_get_url.
    if "data" in result:
        # If the flow creates an entry, check that skip_get_url is present.
        assert result["data"].get("skip_get_url") is True
        assert result["data"].get("url") == MOCK_COUNCILS_DATA["CouncilSkip"].get("url")
    else:
        # Otherwise, the form is returned with no errors.
        assert result["type"] == "form"


# ---------------------------
# Tests for reconfigure steps
# ---------------------------
@pytest.mark.asyncio
async def test_async_step_reconfigure_confirm_user_input_none_with_dummy_entry(hass):
    """Test async_step_reconfigure_confirm returns form when no user input is provided."""
    flow = UkBinCollectionConfigFlow()
    flow.hass = hass

    # Create a dummy config entry.
    dummy_entry = DummyEntry(
        {
            "name": "Test Name",
            "council": "CouncilTest",
            "uprn": "1234567890",
            "timeout": 60,
        },
        entry_id="dummy",
    )

    # Make sure async_get_entry returns a plain entry.
    hass.config_entries.async_get_entry = MagicMock(return_value=dummy_entry)
    flow.config_entry = dummy_entry
    flow.context = {"entry_id": dummy_entry.entry_id}
    flow.councils_data = MOCK_COUNCILS_DATA

    result = await flow.async_step_reconfigure_confirm(user_input=None)
    assert result["type"] == "form"
    assert result["step_id"] == "reconfigure_confirm"


@pytest.mark.asyncio
async def test_async_step_reconfigure_confirm_invalid_json_and_interval(hass):
    """Test async_step_reconfigure_confirm returns errors with invalid JSON mapping and update_interval."""
    flow = UkBinCollectionConfigFlow()
    flow.hass = hass

    dummy_entry = DummyEntry(
        {
            "name": "Existing Entry",
            "council": "CouncilTest",
            "uprn": "1234567890",
            "timeout": 60,
        },
        entry_id="dummy",
    )
    hass.config_entries.async_get_entry = MagicMock(return_value=dummy_entry)
    flow.config_entry = dummy_entry
    flow.context = {"entry_id": dummy_entry.entry_id}
    flow.councils_data = MOCK_COUNCILS_DATA

    # Patch async_update_entry and async_reload (they won't be used if there are errors)
    hass.config_entries.async_update_entry = MagicMock()
    hass.config_entries.async_reload = AsyncMock()

    user_input = {
        "name": "Updated Entry",
        "council": MOCK_COUNCILS_DATA["CouncilTest"]["wiki_name"],
        "update_interval": "0",  # invalid (less than 1)
        "icon_color_mapping": "invalid json",
        "uprn": "0987654321",
        "timeout": 120,
    }

    result = await flow.async_step_reconfigure_confirm(user_input=user_input)
    assert result["type"] == "form"
    assert result["step_id"] == "reconfigure_confirm"

    # Expect errors for update_interval and icon_color_mapping.
    assert "update_interval" in result["errors"]
    assert "icon_color_mapping" in result["errors"]


# ---------------------------
# Test get_councils_json failure
# ---------------------------
@pytest.mark.asyncio
async def test_get_councils_json_failure(hass):
    flow = UkBinCollectionConfigFlow()
    flow.hass = hass

    with patch("aiohttp.ClientSession") as mock_session_cls:
        # Simulate network error.
        mock_session = mock_session_cls.return_value.__aenter__.return_value
        mock_session.get.side_effect = Exception("Network error")

        result = await flow.get_councils_json()
        assert result == {}


# ---------------------------
# Test get_council_schema
# ---------------------------
@pytest.mark.asyncio
async def test_get_council_schema(hass):
    flow = UkBinCollectionConfigFlow()
    flow.hass = hass
    flow.councils_data = {
        "CouncilTest": {
            "wiki_name": "Council Test",
            "skip_get_url": False,
            "uprn": True,
            "postcode": True,
            "house_number": True,
            "usrn": True,
            "web_driver": True,
        }
    }

    schema = await flow.get_council_schema("CouncilTest")

    # Check that required fields appear in the schema.
    required_fields = ["url", "uprn", "postcode", "number", "usrn", "timeout"]
    for field in required_fields:
        assert field in schema.schema


# ---------------------------
# Test build_reconfigure_schema
# ---------------------------
def test_build_reconfigure_schema(hass):
    flow = UkBinCollectionConfigFlow()
    flow.council_names = ["CouncilTest"]
    flow.council_options = ["Council Test"]

    existing_data = {
        "name": "Old Name",
        "council": "CouncilTest",
        "update_interval": 12,
        "url": "https://example.com",
        "icon_color_mapping": "{}",
    }

    schema = flow.build_reconfigure_schema(existing_data, "Council Test")
    assert isinstance(schema, vol.Schema)

    schema_dict = schema.schema
    assert "name" in schema_dict
    assert "council" in schema_dict
    assert "update_interval" in schema_dict


# ---------------------------
# Test async_step_import
# ---------------------------
@pytest.mark.asyncio
async def test_async_step_import(hass):
    """Test that import flows call async_step_user."""
    with patch(
        "custom_components.uk_bin_collection.config_flow.UkBinCollectionConfigFlow.get_councils_json",
        return_value=MOCK_COUNCILS_DATA,
    ):
        flow = UkBinCollectionConfigFlow()
        flow.hass = hass

        import_config = {"name": "Imported", "council": "Council Test", "uprn": "111"}
        result = await flow.async_step_import(import_config)
        assert result is not None


@pytest.mark.asyncio
async def test_options_flow_no_councils(dummy_hass):
    """Test async_step_init aborts if get_councils_json returns empty data."""
    config_entry = MockConfigEntry(
        domain=DOMAIN, data={"name": "Test Options"}, entry_id="opt_test"
    )
    config_entry.add_to_hass(dummy_hass)

    flow = UkBinCollectionOptionsFlowHandler(config_entry)
    flow.hass = dummy_hass

    # Patch get_councils_json to return an empty dict
    flow.get_councils_json = AsyncMock(return_value={})

    result = await flow.async_step_init(user_input=None)

    # Expect an abort with reason "Council Data Unavailable"
    assert result["reason"] == "Council Data Unavailable"


def test_build_options_schema(options_flow):
    """Test that build_options_schema returns a schema with expected keys."""
    flow, config_entry = options_flow

    # Set up the lists for schema building
    flow.council_names = ["CouncilTest"]
    flow.council_options = ["Council Test"]

    existing_data = {
        "name": "Test Options",
        "council": "CouncilTest",
        "update_interval": 12,
        "icon_color_mapping": '{"CouncilTest": {"icon": "mdi:trash", "color": "green"}}',
    }
    schema = flow.build_options_schema(existing_data)

    sample = schema(
        {"name": "Test Options", "council": "Council Test", "update_interval": 12}
    )
    assert isinstance(sample, dict)

    sample_with_optional = schema(
        {
            "name": "Test Options",
            "council": "Council Test",
            "update_interval": 12,
            "icon_color_mapping": '{"key": "value"}',
        }
    )
    assert "icon_color_mapping" in sample_with_optional


def test_map_wiki_name_to_council_key_options(options_flow):
    """Test mapping from wiki name to council key."""
    flow, _ = options_flow
    flow.council_options = ["Council Test"]
    flow.council_names = ["CouncilTest"]
    assert flow.map_wiki_name_to_council_key("Council Test") == "CouncilTest"
    assert flow.map_wiki_name_to_council_key("Nonexistent") == ""


def test_is_valid_json_options_handler():
    """Test is_valid_json for valid and invalid JSON."""
    from custom_components.uk_bin_collection.config_flow import (
        UkBinCollectionOptionsFlowHandler,
    )

    valid = '{"key": "value"}'
    invalid = '{"key": "value" "missing_comma": true}'
    assert UkBinCollectionOptionsFlowHandler.is_valid_json(valid) is True
    assert UkBinCollectionOptionsFlowHandler.is_valid_json(invalid) is False


# --- Test: Helper method is_valid_json ---
def test_is_valid_json_options():
    valid = '{"key": "value"}'
    invalid = '{"key": "value",}'  # trailing comma
    assert UkBinCollectionOptionsFlowHandler.is_valid_json(valid) is True
    assert UkBinCollectionOptionsFlowHandler.is_valid_json(invalid) is False


================================================
FILE: custom_components/uk_bin_collection/tests/test_init.py
================================================
# test_init.py
import asyncio
import json
from datetime import datetime, timedelta
from unittest.mock import AsyncMock, MagicMock, patch

import pytest
from homeassistant.config_entries import ConfigEntry
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import UpdateFailed

# Import the functions and classes from your __init__.py file.
from custom_components.uk_bin_collection import (
    HouseholdBinCoordinator,
    async_migrate_entry,
    async_setup,
    async_setup_entry,
    async_unload_entry,
    build_ukbcd_args,
)
from custom_components.uk_bin_collection.const import DOMAIN, LOG_PREFIX, PLATFORMS
from uk_bin_collection.uk_bin_collection.collect_data import UKBinCollectionApp


class DummyUKBinCollectionApp:
    def __init__(self):
        self.args = None
        # Set parsed_args so that self.parsed_args.module is valid (using "json" here as an example)
        self.parsed_args = type("DummyArgs", (), {"module": "json"})()

    def set_args(self, args):
        self.args = args
        self.parsed_args = type("DummyArgs", (), {"module": "json"})()

    def run(self):
        # Return valid JSON data expected by the coordinator.
        return json.dumps(
            {
                "bins": [
                    {
                        "type": "waste",
                        "collectionDate": datetime.now().strftime("%d/%m/%Y"),
                    },
                    {
                        "type": "recycling",
                        "collectionDate": (datetime.now() + timedelta(days=1)).strftime(
                            "%d/%m/%Y"
                        ),
                    },
                ]
            }
        )


# Create a dummy config entry for testing.
class DummyConfigEntry:
    def __init__(self, data, version=1, entry_id="dummy_entry"):
        self.data = data
        self.version = version
        self.entry_id = entry_id


# Create a dummy HomeAssistant object.
class DummyHass:
    def __init__(self):
        self.data = {}
        self.services = Services()
        self.config_entries = ConfigEntries()

    async def async_add_executor_job(self, func, *args, **kwargs):
        # In tests, we can simply run the function synchronously.
        return func(*args, **kwargs)


class Services:
    def __init__(self):
        self.registrations = {}

    def async_register(self, domain, service, service_func):
        self.registrations[(domain, service)] = service_func


class ConfigEntries:
    async def async_forward_entry_setups(self, config_entry, platforms):
        # In tests, simply return an empty list (or you can simulate something).
        return []

    async def async_forward_entry_unload(self, config_entry, platform):
        # Simulate a successful unload.
        return True

    def async_update_entry(self, config_entry, data):
        config_entry.data = data


@pytest.fixture
def hass():
    return DummyHass()


@pytest.fixture
def dummy_config_entry():
    data = {
        "name": "Test Entry",
        "timeout": 60,
        "manual_refresh_only": True,
        "icon_color_mapping": "{}",
        "update_interval": 12,
        "council": "json",  # Use a valid module name
        "url": "http://example.com",
    }
    return DummyConfigEntry(data)


@pytest.mark.asyncio
async def test_household_bin_coordinator_retains_last_good_data(hass):
    # Create a dummy app with dynamic run output
    class DynamicUKBinCollectionApp:
        def __init__(self):
            self.call_count = 0

        def set_args(self, args):
            pass

        def run(self):
            self.call_count += 1
            if self.call_count == 1:
                # First call: valid data
                return json.dumps(
                    {
                        "bins": [
                            {
                                "type": "waste",
                                "collectionDate": datetime.now().strftime("%d/%m/%Y"),
                            },
                        ]
                    }
                )
            else:
                # Second call: empty bins
                return json.dumps({"bins": []})

    dummy_app = DynamicUKBinCollectionApp()
    coordinator = HouseholdBinCoordinator(
        hass,
        dummy_app,
        name="Test Bin",
        timeout=2,
        update_interval=timedelta(minutes=5),
    )

    # First fetch - stores valid data
    first_data = await coordinator._async_update_data()
    assert "waste" in first_data

    # Second fetch - empty, should fall back to previous data
    second_data = await coordinator._async_update_data()
    assert second_data == first_data  # Confirm fallback occurred


# --- Test async_setup ---
@pytest.mark.asyncio
async def test_async_setup_success(hass):
    # Call async_setup with a dummy config
    config = {"uk_bin_collection": {}}
    result = await async_setup(hass, config)
    assert result is True
    # Check that the integration data was initialized.
    assert DOMAIN in hass.data


@pytest.mark.asyncio
async def test_manual_refresh_no_entry(hass):
    # Call async_setup to register the service.
    config = {"uk_bin_collection": {}}
    await async_setup(hass, config)

    # Get the manual refresh service.
    service_func = hass.services.registrations.get((DOMAIN, "manual_refresh"))

    # Create a dummy call with no entry_id.
    dummy_call = MagicMock()
    dummy_call.data = {}

    # Capture log output or simply run the service call.
    await service_func(dummy_call)
    # You might check logs to verify the error was logged.


# --- Test async_migrate_entry ---
@pytest.mark.asyncio
async def test_async_migrate_entry_version_1(hass, dummy_config_entry):
    dummy_config_entry.version = 1
    # Remove update_interval to test migration defaults.
    dummy_config_entry.data.pop("update_interval", None)

    result = await async_migrate_entry(hass, dummy_config_entry)
    assert result is True
    # Now update_interval should be set to 12.
    assert dummy_config_entry.data["update_interval"] == 12


@pytest.mark.asyncio
async def test_async_migrate_entry_no_migration(hass, dummy_config_entry):
    dummy_config_entry.version = 2
    result = await async_migrate_entry(hass, dummy_config_entry)
    assert result is True


# --- Test async_setup_entry ---
@pytest.mark.asyncio
async def test_async_setup_entry_success(hass, dummy_config_entry):
    # Ensure hass.data[DOMAIN] is initialized.
    hass.data.setdefault(DOMAIN, {})

    # Patch the UKBinCollectionApp in the integration's namespace.
    with patch(
        "custom_components.uk_bin_collection.UKBinCollectionApp",
        return_value=DummyUKBinCollectionApp(),
    ):
        result = await async_setup_entry(hass, dummy_config_entry)
        assert result is True
        # Verify that the coordinator was stored in hass.data.
        assert dummy_config_entry.entry_id in hass.data[DOMAIN]


@pytest.mark.asyncio
async def test_async_setup_entry_missing_name(hass, dummy_config_entry):
    # Remove "name" to force a failure.
    dummy_config_entry.data.pop("name")
    with pytest.raises(ConfigEntryNotReady):
        await async_setup_entry(hass, dummy_config_entry)


# --- Test async_unload_entry ---
@pytest.mark.asyncio
async def test_async_unload_entry_success(hass, dummy_config_entry):
    # Prepopulate hass.data with a dummy coordinator.
    hass.data.setdefault(DOMAIN, {})[dummy_config_entry.entry_id] = {
        "coordinator": "dummy"
    }

    result = await async_unload_entry(hass, dummy_config_entry)
    assert result is True
    # The coordinator should have been removed.
    assert dummy_config_entry.entry_id not in hass.data[DOMAIN]


# --- Test build_ukbcd_args ---
def test_build_ukbcd_args_excludes_keys():
    config_data = {
        "council": "Test Council",
        "url": "http://example.com",
        "skip_get_url": "should be excluded",
        "custom_arg": "value",
    }
    args = build_ukbcd_args(config_data)

    # Check that the first two arguments are the council and url.
    assert args[0] == "Test Council"
    assert args[1] == "http://example.com"

    # The custom_arg should be included, but skip_get_url should not.
    args_str = " ".join(args)
    assert "--custom_arg=value" in args_str
    assert "skip_get_url" not in args_str


# --- Test HouseholdBinCoordinator update ---
@pytest.mark.asyncio
async def test_household_bin_coordinator_update(hass):
    # Create a dummy app whose run method returns valid JSON.
    dummy_app = DummyUKBinCollectionApp()
    coordinator = HouseholdBinCoordinator(
        hass, dummy_app, name="Test Bin", timeout=1, update_interval=timedelta(hours=1)
    )

    # Test the _async_update_data method.
    data = await coordinator._async_update_data()
    # Expect the data to be a dict with at least one bin type.
    assert isinstance(data, dict)
    assert "waste" in data or "recycling" in data


def test_process_bin_data_valid():
    # Test process_bin_data with valid bin data.
    now_str = datetime.now().strftime("%d/%m/%Y")
    data = {
        "bins": [
            {"type": "waste", "collectionDate": now_str},
            {"type": "recycling", "collectionDate": now_str},
        ]
    }
    processed = HouseholdBinCoordinator.process_bin_data(data)
    # Both bins should be in the processed output.
    assert "waste" in processed
    assert "recycling" in processed


def test_process_bin_data_invalid():
    # Test process_bin_data with missing keys and malformed date.
    data = {
        "bins": [
            {"type": None, "collectionDate": "bad_date"},
            {"collectionDate": "01/01/2025"},
        ]
    }
    processed = HouseholdBinCoordinator.process_bin_data(data)
    # Should be empty because data was invalid.
    assert processed == {}


================================================
FILE: custom_components/uk_bin_collection/tests/test_sensor.py
================================================
import asyncio
import json
import logging
from datetime import date, datetime, timedelta
from json import JSONDecodeError
from unittest.mock import AsyncMock, MagicMock, patch, Mock

import pytest
from freezegun import freeze_time
from homeassistant.config_entries import ConfigEntryState
from homeassistant.exceptions import ConfigEntryNotReady
from homeassistant.helpers.update_coordinator import UpdateFailed
from homeassistant.util import dt as dt_util
from homeassistant.core import ServiceCall

from custom_components.uk_bin_collection import (
    async_setup_entry as async_setup_entry_domain,
)
from custom_components.uk_bin_collection.sensor import (
    async_setup_entry as async_setup_entry_sensor,
)
from custom_components.uk_bin_collection.const import (
    DOMAIN,
    LOG_PREFIX,
    STATE_ATTR_COLOUR,
    STATE_ATTR_NEXT_COLLECTION,
    STATE_ATTR_DAYS,
)
from custom_components.uk_bin_collection.sensor import (
    UKBinCollectionAttributeSensor,
    UKBinCollectionDataSensor,
    UKBinCollectionRawJSONSensor,
    create_sensor_entities,
    load_icon_color_mapping,
)
from custom_components.uk_bin_collection import HouseholdBinCoordinator

logging.basicConfig(level=logging.DEBUG)

from .common_utils import MockConfigEntry

pytest_plugins = ["freezegun"]

# Mock Data
MOCK_BIN_COLLECTION_DATA = {
    "bins": [
        {"type": "General Waste", "collectionDate": "15/10/2023"},
        {"type": "Recycling", "collectionDate": "16/10/2023"},
        {"type": "Garden Waste", "collectionDate": "17/10/2023"},
    ]
}

MOCK_PROCESSED_DATA = {
    "General Waste": datetime.strptime("15/10/2023", "%d/%m/%Y").date(),
    "Recycling": datetime.strptime("16/10/2023", "%d/%m/%Y").date(),
    "Garden Waste": datetime.strptime("17/10/2023", "%d/%m/%Y").date(),
}


@pytest.fixture
def mock_config_entry():
    """Create a mock ConfigEntry."""
    return MockConfigEntry(
        domain=DOMAIN,
        title="Test Entry",
        data={
            "name": "Test Name",
            "council": "Test Council",
            "url": "https://example.com",
            "timeout": 60,
            "icon_color_mapping": {},
        },
        entry_id="test",
        unique_id="test_unique_id",
    )


# Tests
def test_process_bin_data(freezer):
    """Test processing of bin collection data."""
    freezer.move_to("2023-10-14")
    processed_data = HouseholdBinCoordinator.process_bin_data(MOCK_BIN_COLLECTION_DATA)
    # Convert dates to strings for comparison
    processed_data_str = {k: v.strftime("%Y-%m-%d") for k, v in processed_data.items()}
    expected_data_str = {
        k: v.strftime("%Y-%m-%d") for k, v in MOCK_PROCESSED_DATA.items()
    }
    assert processed_data_str == expected_data_str


def test_process_bin_data_empty():
    """Test processing when data is empty."""
    processed_data = HouseholdBinCoordinator.process_bin_data({"bins": []})
    assert processed_data == {}


def test_process_bin_data_past_dates(freezer):
    """Test processing when all dates are in the past."""
    freezer.move_to("2023-10-14")
    past_date = (datetime(2023, 10, 14) - timedelta(days=1)).strftime("%d/%m/%Y")
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": past_date},
        ]
    }
    processed_data = HouseholdBinCoordinator.process_bin_data(data)
    assert processed_data == {}  # No future dates


def test_process_bin_data_duplicate_bin_types(freezer):
    """Test processing when duplicate bin types are present."""
    freezer.move_to("2023-10-14")
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": "15/10/2023"},
            {"type": "General Waste", "collectionDate": "16/10/2023"},  # Later date
        ]
    }
    expected = {
        "General Waste": date(2023, 10, 15),  # Should take the earliest future date
    }
    processed_data = HouseholdBinCoordinator.process_bin_data(data)
    assert processed_data == expected


@pytest.mark.asyncio
@freeze_time("2023-10-14")
async def test_async_setup_entry(hass, mock_config_entry):
    """Test setting up the sensor platform directly."""
    # 1) We need to fake the coordinator in hass.data
    hass.data = {}
    hass.data.setdefault(DOMAIN, {})

    # Create a mock coordinator (or a real one if you like)
    mock_coordinator = MagicMock()
    # Store it under the entry_id, as the normal domain code would do
    hass.data[DOMAIN][mock_config_entry.entry_id] = {"coordinator": mock_coordinator}

    # 2) Prepare a mock to track added entities
    async_add_entities = Mock()

    # 3) Patch the sensor's UKBinCollectionApp calls if needed
    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps({"bins": []})

        with patch.object(
            hass,
            "async_add_executor_job",
            new_callable=AsyncMock,
            return_value=mock_app_instance.run.return_value,
        ):
            # 4) Now call the sensor setup function
            await async_setup_entry_sensor(hass, mock_config_entry, async_add_entities)

    # 5) Assert that the sensors got set up
    assert async_add_entities.call_count == 1
    # ... any other assertions you want


@freeze_time("2023-10-14")
@pytest.mark.asyncio
async def test_coordinator_fetch(hass):
    """Test the data fetch by the coordinator."""
    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps(MOCK_BIN_COLLECTION_DATA)

        with patch.object(
            hass,
            "async_add_executor_job",
            new_callable=AsyncMock,
            return_value=mock_app_instance.run.return_value,
        ):
            coordinator = HouseholdBinCoordinator(
                hass, mock_app_instance, "Test Name", timeout=60
            )
            await coordinator.async_refresh()

    assert (
        coordinator.data == MOCK_PROCESSED_DATA
    ), "Coordinator data does not match expected values."
    assert (
        coordinator.last_update_success is True
    ), "Coordinator update was not successful."


@pytest.mark.asyncio
async def test_bin_sensor(hass, mock_config_entry):
    """Test the main bin sensor."""
    from freezegun import freeze_time

    hass.data = {}
    with freeze_time("2023-10-14"):
        with patch(
            "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
        ) as mock_app:
            mock_app_instance = mock_app.return_value
            mock_app_instance.run.return_value = json.dumps(MOCK_BIN_COLLECTION_DATA)

            with patch.object(
                hass,
                "async_add_executor_job",
                return_value=mock_app_instance.run.return_value,
            ):
                coordinator = HouseholdBinCoordinator(
                    hass, mock_app_instance, "Test Name", timeout=60
                )
                await coordinator.async_config_entry_first_refresh()

            sensor = UKBinCollectionDataSensor(
                coordinator, "General Waste", "test_general_waste", {}
            )

            assert sensor.name == "Test Name General Waste"
            assert sensor.unique_id == "test_general_waste"
            assert sensor.state == "Tomorrow"
            assert sensor.icon == "mdi:trash-can"
            assert sensor.extra_state_attributes == {
                "colour": "black",
                "next_collection": "15/10/2023",
                "days": 1,
            }


@freeze_time("2023-10-14")
@pytest.mark.asyncio
async def test_raw_json_sensor(hass, mock_config_entry):
    """Test the raw JSON sensor."""
    hass.data = {}
    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps(MOCK_BIN_COLLECTION_DATA)

        with patch.object(
            hass,
            "async_add_executor_job",
            new_callable=AsyncMock,
            return_value=mock_app_instance.run.return_value,
        ):
            coordinator = HouseholdBinCoordinator(
                hass, mock_app_instance, "Test Name", timeout=60
            )
            await coordinator.async_refresh()

    sensor = UKBinCollectionRawJSONSensor(coordinator, "test_raw_json", "Test Name")

    expected_state = json.dumps(
        {k: v.strftime("%d/%m/%Y") for k, v in MOCK_PROCESSED_DATA.items()}
    )

    assert sensor.name == "Test Name Raw JSON"
    assert sensor.unique_id == "test_raw_json"
    assert sensor.state == expected_state
    assert sensor.extra_state_attributes == {"raw_data": MOCK_PROCESSED_DATA}


@pytest.mark.asyncio
async def test_bin_sensor_custom_icon_color(hass, mock_config_entry):
    """Test bin sensor with custom icon and color."""
    icon_color_mapping = {"General Waste": {"icon": "mdi:delete", "color": "green"}}

    # Initialize hass.data
    hass.data = {}

    # Patch UKBinCollectionApp
    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        # Mock run method to return JSON data for testing
        mock_app_instance.run.return_value = json.dumps(MOCK_BIN_COLLECTION_DATA)

        # Mock async_add_executor_job correctly
        with patch.object(
            hass,
            "async_add_executor_job",
            new=AsyncMock(return_value=mock_app_instance.run.return_value),
        ):
            # Create the coordinator
            coordinator = HouseholdBinCoordinator(
                hass, mock_app_instance, "Test Name", timeout=60
            )

            # Perform the first refresh
            await coordinator.async_config_entry_first_refresh()

        # Create a bin sensor with custom icon and color mapping
        sensor = UKBinCollectionDataSensor(
            coordinator, "General Waste", "test_general_waste", icon_color_mapping
        )

        # Access properties
        assert sensor.icon == "mdi:delete"
        assert sensor.extra_state_attributes["colour"] == "green"


@pytest.mark.asyncio
async def test_bin_sensor_today_collection(hass, freezer, mock_config_entry):
    """Test bin sensor when collection is today."""
    freezer.move_to("2023-10-14")
    today_date = dt_util.now().strftime("%d/%m/%Y")
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": today_date},
        ]
    }

    # Initialize hass.data
    hass.data = {}

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        # Mock run method to return JSON data for testing
        mock_app_instance.run.return_value = json.dumps(data)

        # Mock async_add_executor_job correctly
        with patch.object(
            hass,
            "async_add_executor_job",
            new=AsyncMock(return_value=mock_app_instance.run.return_value),
        ):
            # Create the coordinator
            coordinator = HouseholdBinCoordinator(
                hass, mock_app_instance, "Test Name", timeout=60
            )

            # Perform the first refresh
            await coordinator.async_config_entry_first_refresh()

        # Create a bin sensor
        sensor = UKBinCollectionDataSensor(
            coordinator, "General Waste", "test_general_waste", {}
        )

        # Access properties
        assert sensor.state == "Today"


@pytest.mark.asyncio
async def test_bin_sensor_tomorrow_collection(hass, freezer, mock_config_entry):
    """Test bin sensor when collection is tomorrow."""
    freezer.move_to("2023-10-14")
    tomorrow_date = (dt_util.now() + timedelta(days=1)).strftime("%d/%m/%Y")
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": tomorrow_date},
        ]
    }

    # Initialize hass.data
    hass.data = {}

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        # Mock run method to return JSON data for testing
        mock_app_instance.run.return_value = json.dumps(data)

        # Mock async_add_executor_job correctly
        with patch.object(
            hass,
            "async_add_executor_job",
            new=AsyncMock(return_value=mock_app_instance.run.return_value),
        ):
            # Create the coordinator
            coordinator = HouseholdBinCoordinator(
                hass, mock_app_instance, "Test Name", timeout=60
            )

            # Perform the first refresh
            await coordinator.async_config_entry_first_refresh()

        # Create a bin sensor
        sensor = UKBinCollectionDataSensor(
            coordinator, "General Waste", "test_general_waste", {}
        )

        # Access properties
        assert sensor.state == "Tomorrow"


@pytest.mark.asyncio
async def test_bin_sensor_partial_custom_icon_color(hass, mock_config_entry):
    """Test bin sensor with partial custom icon and color mappings."""
    icon_color_mapping = {"General Waste": {"icon": "mdi:delete", "color": "green"}}

    # Use data that includes another bin type without a custom mapping
    custom_data = {
        "bins": [
            {"type": "General Waste", "collectionDate": "15/10/2023"},
            {"type": "Recycling", "collectionDate": "16/10/2023"},
        ]
    }

    # Initialize hass.data
    hass.data = {}

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps(custom_data)

        # Mock async_add_executor_job correctly
        with patch.object(
            hass,
            "async_add_executor_job",
            new=AsyncMock(return_value=mock_app_instance.run.return_value),
        ):
            # Create the coordinator
            coordinator = HouseholdBinCoordinator(
                hass, mock_app_instance, "Test Name", timeout=60
            )

            # Perform the first refresh
            await coordinator.async_config_entry_first_refresh()

        # Create sensors for both bin types
        sensor_general = UKBinCollectionDataSensor(
            coordinator, "General Waste", "test_general_waste", icon_color_mapping
        )
        sensor_recycling = UKBinCollectionDataSensor(
            coordinator, "Recycling", "test_recycling", icon_color_mapping
        )

        # Check custom mapping for General Waste
        assert sensor_general.icon == "mdi:delete"
        assert sensor_general.extra_state_attributes["colour"] == "green"

        # Check default mapping for Recycling
        assert sensor_recycling.icon == "mdi:recycle"
        assert sensor_recycling.extra_state_attributes["colour"] == "black"


def test_unique_id_uniqueness(hass, mock_config_entry):
    """Test that each sensor has a unique ID."""
    coordinator = MagicMock()
    coordinator.name = "Test Name"
    coordinator.data = MOCK_PROCESSED_DATA

    sensor1 = UKBinCollectionDataSensor(
        coordinator, "General Waste", "test_general_waste", {}
    )
    sensor2 = UKBinCollectionDataSensor(coordinator, "Recycling", "test_recycling", {})

    assert sensor1.unique_id == "test_general_waste"
    assert sensor2.unique_id == "test_recycling"
    assert sensor1.unique_id != sensor2.unique_id


@pytest.fixture
def mock_dt_now_different_timezone():
    """Mock datetime.now with a different timezone."""
    with patch(
        "homeassistant.util.dt.now",
        return_value=datetime(2023, 10, 14, 12, 0, tzinfo=dt_util.UTC),
    ):
        yield


def test_sensor_device_info(hass, mock_config_entry):
    """Test that sensors report correct device information."""
    coordinator = MagicMock()
    coordinator.name = "Test Name"
    coordinator.data = MOCK_PROCESSED_DATA

    sensor = UKBinCollectionDataSensor(
        coordinator, "General Waste", "test_general_waste", {}
    )

    expected_device_info = {
        "identifiers": {(DOMAIN, "test_general_waste")},
        "name": "Test Name General Waste",
        "manufacturer": "UK Bin Collection",
        "model": "Bin Sensor",
        "sw_version": "1.0",
    }
    assert sensor.device_info == expected_device_info


@pytest.mark.asyncio
async def test_coordinator_timeout_error(hass, mock_config_entry):
    """Test coordinator handles timeout errors correctly."""
    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        # Simulate run raising TimeoutError
        mock_app_instance.run.side_effect = asyncio.TimeoutError("Request timed out")

        # Mock async_add_executor_job to raise TimeoutError
        hass.async_add_executor_job = AsyncMock(
            side_effect=mock_app_instance.run.side_effect
        )

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=1
        )

        # Expect ConfigEntryNotReady instead of UpdateFailed
        with pytest.raises(ConfigEntryNotReady) as exc_info:
            await coordinator.async_config_entry_first_refresh()

        assert "Timeout while updating data" in str(exc_info.value)


@pytest.mark.asyncio
async def test_coordinator_json_decode_error(hass, mock_config_entry):
    """Test coordinator handles JSON decode errors correctly."""
    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        # Simulate run returning invalid JSON
        mock_app_instance.run.return_value = "Invalid JSON String"

        # Mock async_add_executor_job to raise JSONDecodeError
        def side_effect(*args, **kwargs):
            raise JSONDecodeError("Expecting value", "Invalid JSON String", 0)

        hass.async_add_executor_job = AsyncMock(side_effect=side_effect)

        # Initialize hass.data
        hass.data = {}

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=60
        )

        # Expect ConfigEntryNotReady instead of UpdateFailed
        with pytest.raises(ConfigEntryNotReady) as exc_info:
            await coordinator.async_config_entry_first_refresh()

        assert "JSON decode error" in str(exc_info.value)


@pytest.mark.asyncio
async def test_coordinator_general_exception(hass, mock_config_entry):
    """Test coordinator handles general exceptions correctly."""
    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        # Simulate run raising a general exception
        mock_app_instance.run.side_effect = Exception("General error")

        # Mock async_add_executor_job to raise the exception
        hass.async_add_executor_job = AsyncMock(
            side_effect=mock_app_instance.run.side_effect
        )

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=60
        )

        # Expect ConfigEntryNotReady instead of UpdateFailed
        with pytest.raises(ConfigEntryNotReady) as exc_info:
            await coordinator.async_config_entry_first_refresh()

        assert "Unexpected error" in str(exc_info.value)


def test_process_bin_data_duplicate_bin_types_earlier_date(freezer):
    """Test processing when duplicate bin types are present with different dates."""
    freezer.move_to("2023-10-14")
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": "15/10/2023"},
            {"type": "General Waste", "collectionDate": "14/10/2023"},  # Earlier date
        ]
    }
    expected = {
        "General Waste": date(2023, 10, 14),  # Should take the earliest future date
    }
    processed_data = HouseholdBinCoordinator.process_bin_data(data)
    assert processed_data == expected


def test_process_bin_data_past_dates_multiple_bins(freezer):
    """Test processing when all dates are in the past."""
    freezer.move_to("2023-10-14")
    past_date = (dt_util.now() - timedelta(days=1)).strftime("%d/%m/%Y")
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": past_date},
            {"type": "Recycling", "collectionDate": past_date},
        ]
    }
    processed_data = HouseholdBinCoordinator.process_bin_data(data)
    assert processed_data == {}  # No future dates should be included


def test_process_bin_data_missing_fields(freezer):
    """Test processing when some bins are missing required fields."""
    freezer.move_to("2023-10-14")
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": "15/10/2023"},
            {"collectionDate": "16/10/2023"},  # Missing 'type'
            {"type": "Recycling"},  # Missing 'collectionDate'
        ]
    }
    expected = {
        "General Waste": date(2023, 10, 15),
    }
    processed_data = HouseholdBinCoordinator.process_bin_data(data)
    assert processed_data == expected


def test_process_bin_data_invalid_date_format(freezer):
    """Test processing when bins have invalid date formats."""
    freezer.move_to("2023-10-14")
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": "2023-10-15"},  # Incorrect format
            {"type": "Recycling", "collectionDate": "16/13/2023"},  # Invalid month
        ]
    }
    processed_data = HouseholdBinCoordinator.process_bin_data(data)
    assert processed_data == {}  # Both entries should be skipped due to invalid dates


@pytest.mark.asyncio
async def test_bin_sensor_state_today(hass, mock_config_entry, freezer):
    """Test bin sensor when collection is today."""
    freezer.move_to("2023-10-14")
    today_date = dt_util.now().strftime("%d/%m/%Y")
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": today_date},
        ]
    }

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps(data)

        # Mock async_add_executor_job to return the run method's return value
        hass.async_add_executor_job = AsyncMock(
            return_value=mock_app_instance.run.return_value
        )

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=60
        )
        await coordinator.async_config_entry_first_refresh()

        sensor = UKBinCollectionDataSensor(
            coordinator, "General Waste", "test_general_waste", {}
        )

        assert sensor.state == "Today"
        assert sensor.available is True
        assert sensor.extra_state_attributes["days"] == 0


@pytest.mark.asyncio
async def test_bin_sensor_state_tomorrow(hass, mock_config_entry, freezer):
    """Test bin sensor when collection is tomorrow."""
    freezer.move_to("2023-10-14")
    tomorrow_date = (dt_util.now() + timedelta(days=1)).strftime("%d/%m/%Y")
    data = {
        "bins": [
            {"type": "Recycling", "collectionDate": tomorrow_date},
        ]
    }

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps(data)

        hass.async_add_executor_job = AsyncMock(
            return_value=mock_app_instance.run.return_value
        )

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=60
        )
        await coordinator.async_config_entry_first_refresh()

        sensor = UKBinCollectionDataSensor(
            coordinator, "Recycling", "test_recycling", {}
        )

        assert sensor.state == "Tomorrow"
        assert sensor.available is True
        assert sensor.extra_state_attributes["days"] == 1


@pytest.mark.asyncio
async def test_bin_sensor_state_in_days(hass, mock_config_entry, freezer):
    """Test bin sensor when collection is in multiple days."""
    freezer.move_to("2023-10-14")
    future_date = (dt_util.now() + timedelta(days=5)).strftime("%d/%m/%Y")
    data = {
        "bins": [
            {"type": "Garden Waste", "collectionDate": future_date},
        ]
    }

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps(data)

        hass.async_add_executor_job = AsyncMock(
            return_value=mock_app_instance.run.return_value
        )

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=60
        )
        await coordinator.async_config_entry_first_refresh()

        sensor = UKBinCollectionDataSensor(
            coordinator, "Garden Waste", "test_garden_waste", {}
        )
        assert sensor.state == "In 5 days"
        assert sensor.available is True
        assert sensor.extra_state_attributes["days"] == 5


@pytest.mark.asyncio
async def test_bin_sensor_missing_data(hass, mock_config_entry):
    """Test bin sensor when bin data is missing."""
    data = {"bins": []}  # No bins provided

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps(data)

        hass.async_add_executor_job = AsyncMock(
            return_value=mock_app_instance.run.return_value
        )

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=60
        )
        await coordinator.async_config_entry_first_refresh()

        sensor = UKBinCollectionDataSensor(
            coordinator, "Non-Existent Bin", "test_non_existent_bin", {}
        )

        assert sensor.state == "Unknown"
        assert sensor.available is False
        assert sensor.extra_state_attributes["days"] is None
        assert sensor.extra_state_attributes["next_collection"] is None


@freeze_time("2023-10-14")
@pytest.mark.asyncio
async def test_raw_json_sensor_invalid_data(hass, mock_config_entry):
    """Test raw JSON sensor with invalid data."""
    invalid_data = "Invalid JSON String"

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = invalid_data

        def side_effect(*args, **kwargs):
            raise JSONDecodeError("Expecting value", invalid_data, 0)

        with patch.object(hass, "async_add_executor_job", side_effect=side_effect):
            coordinator = HouseholdBinCoordinator(
                hass, mock_app_instance, "Test Name", timeout=60
            )
            # Refreshing should NOT raise; the coordinator records the failure.
            await coordinator.async_refresh()

    assert not coordinator.last_update_success

    raw_json_sensor = UKBinCollectionRawJSONSensor(
        coordinator, "test_raw_json", "Test Name"
    )
    # Since the data fetch failed, the sensor state should reflect the failure.
    assert raw_json_sensor.state == "{}"
    assert raw_json_sensor.extra_state_attributes["raw_data"] == {}
    assert raw_json_sensor.available is False


@pytest.mark.asyncio
async def test_sensor_available_property(hass, mock_config_entry):
    """Test that sensor's available property reflects its state."""
    # Case 1: State is a valid string
    data_valid = {
        "bins": [
            {"type": "Recycling", "collectionDate": "16/10/2023"},
        ]
    }

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app_valid:
        mock_app_valid_instance = mock_app_valid.return_value
        mock_app_valid_instance.run.return_value = json.dumps(data_valid)

        with patch.object(
            hass,
            "async_add_executor_job",
            return_value=mock_app_valid_instance.run.return_value,
        ):
            coordinator_valid = HouseholdBinCoordinator(
                hass, mock_app_valid_instance, "Test Name", timeout=60
            )
            await coordinator_valid.async_refresh()

        sensor_valid = UKBinCollectionDataSensor(
            coordinator_valid, "Recycling", "test_recycling_available", {}
        )
        assert sensor_valid.available is True

    # Case 2: State is "Unknown"
    data_unknown = {"bins": []}

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app_unknown:
        mock_app_unknown_instance = mock_app_unknown.return_value
        mock_app_unknown_instance.run.return_value = json.dumps(data_unknown)

        with patch.object(
            hass,
            "async_add_executor_job",
            return_value=mock_app_unknown_instance.run.return_value,
        ):
            coordinator_unknown = HouseholdBinCoordinator(
                hass, mock_app_unknown_instance, "Test Name", timeout=60
            )
            await coordinator_unknown.async_refresh()

        sensor_unknown = UKBinCollectionDataSensor(
            coordinator_unknown, "Garden Waste", "test_garden_waste_unavailable", {}
        )
        assert sensor_unknown.available is False


@pytest.mark.asyncio
async def test_data_sensor_missing_icon_or_color(hass, mock_config_entry):
    """Test data sensor uses default icon and color when mappings are missing."""
    icon_color_mapping = {
        "General Waste": {"icon": "mdi:trash-can"},  # Missing 'color'
        "Recycling": {"color": "green"},  # Missing 'icon'
        "Garden Waste": {},  # Missing both
    }
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": "15/10/2023"},
            {"type": "Recycling", "collectionDate": "16/10/2023"},
            {"type": "Garden Waste", "collectionDate": "17/10/2023"},
        ]
    }

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps(data)

        hass.async_add_executor_job = AsyncMock(
            return_value=mock_app_instance.run.return_value
        )

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=60
        )
        await coordinator.async_config_entry_first_refresh()

        # Test General Waste sensor (missing 'color')
        general_waste_sensor = UKBinCollectionDataSensor(
            coordinator, "General Waste", "test_general_waste", icon_color_mapping
        )
        # Simulate coordinator update
        coordinator.async_set_updated_data(coordinator.data)
        assert general_waste_sensor.icon == "mdi:trash-can"
        assert general_waste_sensor._color == "black"  # Default color

        # Test Recycling sensor (missing 'icon')
        recycling_sensor = UKBinCollectionDataSensor(
            coordinator, "Recycling", "test_recycling", icon_color_mapping
        )
        coordinator.async_set_updated_data(coordinator.data)
        assert recycling_sensor.icon == "mdi:recycle"  # Default icon based on bin type
        assert recycling_sensor._color == "green"

        # Test Garden Waste sensor (missing both)
        garden_waste_sensor = UKBinCollectionDataSensor(
            coordinator, "Garden Waste", "test_garden_waste", icon_color_mapping
        )
        coordinator.async_set_updated_data(coordinator.data)
        assert garden_waste_sensor.icon == "mdi:trash-can"  # Default icon based on bin type
        assert garden_waste_sensor._color == "black"


@pytest.mark.asyncio
async def test_attribute_sensor_with_complete_mappings(hass, mock_config_entry):
    """Test attribute sensor correctly applies icon and color from mappings."""
    icon_color_mapping = {"General Waste": {"icon": "mdi:trash-can", "color": "grey"}}
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": "15/10/2023"},
        ]
    }

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps(data)

        # Mock async_add_executor_job to return valid JSON
        hass.async_add_executor_job = AsyncMock(
            return_value=mock_app_instance.run.return_value
        )

        # Initialize hass.data
        hass.data = {}

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=60
        )
        await coordinator.async_config_entry_first_refresh()

        # Test Colour attribute sensor
        colour_sensor = UKBinCollectionAttributeSensor(
            coordinator,
            "General Waste",
            "test_general_waste_colour",
            "Colour",
            "test_general_waste",
            icon_color_mapping,
        )
        # Simulate coordinator update
        coordinator.async_set_updated_data(coordinator.data)

        assert colour_sensor.state == "grey"
        assert colour_sensor.icon == "mdi:trash-can"
        assert colour_sensor._color == "grey"


@pytest.mark.asyncio
async def test_data_sensor_color_property_missing_or_none(hass, mock_config_entry):
    """Test sensor's color property when color is missing or None."""
    # Case 1: Missing color in icon_color_mapping
    icon_color_mapping_missing_color = {
        "General Waste": {"icon": "mdi:trash-can"},
    }
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": "15/10/2023"},
        ]
    }

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app_missing_color:
        mock_app_missing_color_instance = mock_app_missing_color.return_value
        mock_app_missing_color_instance.run.return_value = json.dumps(data)

        hass.async_add_executor_job = AsyncMock(
            return_value=mock_app_missing_color_instance.run.return_value
        )

        coordinator = HouseholdBinCoordinator(
            hass,
            mock_app_missing_color_instance,
            "Test Name",
            timeout=60,
        )
        await coordinator.async_config_entry_first_refresh()

        sensor_missing_color = UKBinCollectionDataSensor(
            coordinator,
            "General Waste",
            "test_general_waste_missing_color",
            icon_color_mapping_missing_color,
        )
        # Simulate coordinator update
        coordinator.async_set_updated_data(coordinator.data)
        assert sensor_missing_color._color == "black"  # Default color

    # Case 2: Color is None
    icon_color_mapping_none_color = {
        "Recycling": {"icon": "mdi:recycle", "color": None},
    }
    data_none_color = {
        "bins": [
            {"type": "Recycling", "collectionDate": "16/10/2023"},
        ]
    }

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app_none_color:
        mock_app_none_color_instance = mock_app_none_color.return_value
        mock_app_none_color_instance.run.return_value = json.dumps(data_none_color)

        hass.async_add_executor_job = AsyncMock(
            return_value=mock_app_none_color_instance.run.return_value
        )

        coordinator_none_color = HouseholdBinCoordinator(
            hass,
            mock_app_none_color_instance,
            "Test Name",
            timeout=60,
        )
        await coordinator_none_color.async_config_entry_first_refresh()

        sensor_none_color = UKBinCollectionDataSensor(
            coordinator_none_color,
            "Recycling",
            "test_recycling_none_color",
            icon_color_mapping_none_color,
        )
        # Simulate coordinator update
        coordinator_none_color.async_set_updated_data(coordinator_none_color.data)
        assert (
            sensor_none_color._color == "black"
        )  # Should default to "black" if color is None


@freeze_time("2023-10-14")
@pytest.mark.asyncio
async def test_sensor_available_property_with_executor_passthrough(
    hass, mock_config_entry
):
    """Test sensor availability when the executor actually invokes the app's run method."""
    data_valid = {
        "bins": [
            {"type": "Recycling", "collectionDate": "16/10/2023"},
        ]
    }
    processed_data_valid = {
        "Recycling": datetime.strptime("16/10/2023", "%d/%m/%Y").date(),
    }

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app_valid:
        mock_app_valid_instance = mock_app_valid.return_value
        mock_app_valid_instance.run.return_value = json.dumps(data_valid)

        async def mock_async_add_executor_job(func, *args, **kwargs):
            return func(*args, **kwargs)

        with patch.object(
            hass,
            "async_add_executor_job",
            side_effect=mock_async_add_executor_job,
        ):
            coordinator_valid = HouseholdBinCoordinator(
                hass, mock_app_valid_instance, "Test Name", timeout=60
            )
            await coordinator_valid.async_refresh()

        # Verify that coordinator.data contains the expected processed data
        assert coordinator_valid.data == processed_data_valid

        sensor_valid = UKBinCollectionDataSensor(
            coordinator_valid, "Recycling", "test_recycling_available", {}
        )
        assert sensor_valid.available is True


@pytest.mark.asyncio
async def test_coordinator_empty_data(hass, mock_config_entry):
    """Test coordinator handles empty data correctly."""
    empty_data = {"bins": []}

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps(empty_data)

        hass.async_add_executor_job = AsyncMock(
            return_value=mock_app_instance.run.return_value
        )

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=60
        )
        await coordinator.async_config_entry_first_refresh()

        assert coordinator.data == {}
        assert coordinator.last_update_success is True


def test_coordinator_custom_update_interval(hass, mock_config_entry):
    """Test that coordinator uses a custom update interval."""
    custom_interval = timedelta(hours=6)
    coordinator = HouseholdBinCoordinator(hass, MagicMock(), "Test Name", timeout=60)
    coordinator.update_interval = custom_interval
    assert coordinator.update_interval == custom_interval


@pytest.mark.asyncio
async def test_async_setup_entry_missing_required_fields(hass):
    """Test domain-level setup fails if 'name' is missing."""
    mock_config_entry = MockConfigEntry(
        domain=DOMAIN,
        data={
            # no "name"
            "council": "Test Council",
            "url": "https://example.com",
            "timeout": 60,
            "icon_color_mapping": {},
        },
        entry_id="test_missing_name",
    )

    with patch("custom_components.uk_bin_collection.UKBinCollectionApp") as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = "{}"
        hass.async_add_executor_job = AsyncMock(return_value="{}")

        with pytest.raises(ConfigEntryNotReady) as exc_info:
            # Call the domain-level function
            await async_setup_entry_domain(hass, mock_config_entry)

        assert "Missing 'name' in configuration." in str(exc_info.value)


@pytest.mark.asyncio
async def test_data_sensor_device_info(hass, mock_config_entry):
    """Test that data sensor reports correct device information."""
    data = {
        "bins": [
            {"type": "General Waste", "collectionDate": "15/10/2023"},
        ]
    }

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps(data)
        icon_color_mapping = {}

        hass.async_add_executor_job = AsyncMock(
            return_value=mock_app_instance.run.return_value
        )

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=60
        )
        await coordinator.async_config_entry_first_refresh()

        sensor = UKBinCollectionDataSensor(
            coordinator,
            "General Waste",
            "test_general_waste_device_info",
            icon_color_mapping,
        )

        expected_device_info = {
            "identifiers": {(DOMAIN, "test_general_waste_device_info")},
            "name": "Test Name General Waste",
            "manufacturer": "UK Bin Collection",
            "model": "Bin Sensor",
            "sw_version": "1.0",
        }
        assert sensor.device_info == expected_device_info


@pytest.mark.asyncio
async def test_data_sensor_default_icon(hass, mock_config_entry):
    """Test data sensor uses default icon based on bin type when no mapping is provided."""
    data = {
        "bins": [
            {"type": "Unknown Bin", "collectionDate": "20/10/2023"},
        ]
    }

    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps(data)
        # No icon_color_mapping provided
        icon_color_mapping = {}

        hass.async_add_executor_job = AsyncMock(
            return_value=mock_app_instance.run.return_value
        )

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=60
        )
        await coordinator.async_config_entry_first_refresh()

        sensor = UKBinCollectionDataSensor(
            coordinator, "Unknown Bin", "test_unknown_bin", icon_color_mapping
        )

        assert sensor.icon == "mdi:delete"
        assert sensor._color == "black"


def test_coordinator_update_interval(hass, mock_config_entry):
    """Test that coordinator uses the correct update interval."""
    coordinator = HouseholdBinCoordinator(hass, MagicMock(), "Test Name", timeout=60)
    assert coordinator.update_interval == timedelta(hours=12)


@pytest.mark.asyncio
async def test_manual_refresh_service(hass, mock_config_entry):
    """Test that calling manual_refresh logic triggers coordinator.async_request_refresh."""
    # 1) Manually set up the coordinator (just like the other sensor tests).
    hass.data = {}
    hass.data.setdefault(DOMAIN, {})

    # Create a coordinator
    with patch(
        "custom_components.uk_bin_collection.sensor.UKBinCollectionApp"
    ) as mock_app:
        mock_app_instance = mock_app.return_value
        mock_app_instance.run.return_value = json.dumps({"bins": []})

        hass.async_add_executor_job = AsyncMock(
            return_value=mock_app_instance.run.return_value
        )

        coordinator = HouseholdBinCoordinator(
            hass, mock_app_instance, "Test Name", timeout=60
        )
        await coordinator.async_config_entry_first_refresh()

    # Store coordinator in hass.data
    hass.data[DOMAIN][mock_config_entry.entry_id] = {"coordinator": coordinator}

    # 2) Duplicate the essence of handle_manual_refresh, but pass in a mock ServiceCall
    async def mock_handle_manual_refresh(call: ServiceCall):
        entry_id = call.data.get("entry_id")
        if not entry_id:
            return
        if entry_id not in hass.data[DOMAIN]:
            return
        c = hass.data[DOMAIN][entry_id].get("coordinator")
        if c:
            await c.async_request_refresh()

    # 3) Patch coordinator.async_request_refresh to confirm it gets called
    with patch.object(
        coordinator, "async_request_refresh", new_callable=AsyncMock
    ) as mock_refresh:
        # Construct a mock ServiceCall that includes the entry_id
        fake_call = ServiceCall(
            domain=DOMAIN,
            service="manual_refresh",
            data={"entry_id": mock_config_entry.entry_id},
        )
        await
mock_handle_manual_refresh(fake_call) mock_refresh.assert_awaited_once() def test_load_icon_color_mapping_invalid_json(): from custom_components.uk_bin_collection.sensor import load_icon_color_mapping invalid_json = ( '{"icon":"mdi:trash" "no_comma":true}' # invalid JSON (missing comma) ) with patch("logging.Logger.warning") as mock_warn: result = load_icon_color_mapping(invalid_json) # The function should return {} assert result == {} # Note the double space after the prefix – adjust to match the actual log message. mock_warn.assert_called_once_with( "[UKBinCollection] Invalid icon_color_mapping JSON: " f"{invalid_json}. Using default settings." ) @pytest.mark.asyncio async def test_bin_sensor_missing_bin_type(hass, mock_config_entry): """Test that we log a warning and set state to Unknown when the bin type is missing.""" # Suppose your coordinator’s data only has "Recycling" data = {"Recycling": datetime(2025, 2, 1).date()} # but the sensor is for "General Waste" # Create the coordinator coordinator = MagicMock() coordinator.data = data coordinator.name = "Test Name" sensor = UKBinCollectionDataSensor( coordinator, "General Waste", "test_general_waste", {} ) with patch("logging.Logger.warning") as mock_warn: sensor.update_state() assert sensor.state == "Unknown" assert sensor.extra_state_attributes["days"] is None assert sensor.available is False mock_warn.assert_called_once_with( "[UKBinCollection] Data for bin type 'General Waste' is missing." 
) @pytest.mark.asyncio async def test_attribute_sensor_undefined_attribute_type(hass, mock_config_entry): coordinator = MagicMock() coordinator.data = {"Recycling": datetime(2025, 1, 1).date()} coordinator.name = "Test Coordinator" sensor = UKBinCollectionAttributeSensor( coordinator=coordinator, bin_type="Recycling", unique_id="test_recycling_undefined", attribute_type="Bogus Attribute", # Will trigger the 'Undefined' path device_id="test_device", icon_color_mapping={}, ) with patch("logging.Logger.warning") as mock_warn: state = sensor.state assert state == "Undefined" mock_warn.assert_called_once_with( "[UKBinCollection] Undefined attribute type: Bogus Attribute" ) @pytest.mark.asyncio async def test_bin_sensor_in_x_days(hass, freezer, mock_config_entry): freezer.move_to("2023-10-14") # next_collection is 5 days away future_date = dt_util.now().date() + timedelta(days=5) coordinator = MagicMock() coordinator.data = {"General Waste": future_date} coordinator.name = "Test Coordinator" sensor = UKBinCollectionDataSensor( coordinator, "General Waste", "test_gw_in_5_days", {} ) assert sensor.state == "In 5 days" def test_data_sensor_default_icon_unknown_type(): coordinator = MagicMock() coordinator.data = {"Some Custom Bin": datetime(2025, 1, 1).date()} coordinator.name = "Test Name" sensor = UKBinCollectionDataSensor(coordinator, "Unknown Type", "test_unknown", {}) assert sensor.icon == "mdi:delete" def test_raw_json_sensor_partial_data(): coordinator = MagicMock() # Only some bins have dates, e.g., "General Waste" is None coordinator.data = {"General Waste": None, "Recycling": datetime(2025, 1, 1).date()} coordinator.last_update_success = True sensor = UKBinCollectionRawJSONSensor(coordinator, "test_raw_json", "Test Name") state = sensor.state # Should JSON encode the 'None' bin assert state == '{"General Waste": null, "Recycling": "01/01/2025"}' def test_data_sensor_unavailable_if_unknown_state(): coordinator = MagicMock() coordinator.data = {} # no bins 
    coordinator.name = "Test Coordinator"

    sensor = UKBinCollectionDataSensor(coordinator, "General Waste", "test_gw", {})
    sensor.update_state()  # triggers "Unknown"
    assert sensor.available is False


def test_attribute_sensor_unavailable_if_coordinator_failed():
    coordinator = MagicMock()
    coordinator.data = {"Recycling": datetime(2025, 1, 1).date()}
    coordinator.last_update_success = False
    coordinator.name = "Test Coordinator"

    attr_sensor = UKBinCollectionAttributeSensor(
        coordinator, "Recycling", "test_attr_fail", "Colour", "device_id", {}
    )
    assert attr_sensor.available is False


import pytest
from unittest.mock import MagicMock
from datetime import datetime

from custom_components.uk_bin_collection.sensor import (
    create_sensor_entities,
    UKBinCollectionDataSensor,
    UKBinCollectionAttributeSensor,
    UKBinCollectionRawJSONSensor,
)


def test_create_sensor_entities_coordinator_data():
    # Set up a coordinator with two bin types.
    coordinator = MagicMock()
    # For example, suppose today is 2025-02-08:
    coordinator.data = {
        "General Waste": date(2025, 2, 8),  # Today
        "Recycling": date(2025, 2, 9),  # Tomorrow
    }
    coordinator.name = "Test Coordinator"

    # Use a valid JSON mapping for General Waste only.
    icon_mapping_json = '{"General Waste":{"icon":"mdi:trash-can","color":"brown"}}'

    entities = create_sensor_entities(coordinator, "test_entry", icon_mapping_json)

    # We expect:
    #   2 main sensors (one per bin type),
    #   2 * 5 = 10 attribute sensors,
    #   1 raw JSON sensor,
    #   Total 13 entities.
    assert len(entities) == 13

    # Check that for "General Waste", the icon from the mapping is used.
    gw_sensor = next(
        e
        for e in entities
        if isinstance(e, UKBinCollectionDataSensor) and "General Waste" in e.name
    )
    assert gw_sensor.icon == "mdi:trash-can"

    # And its attribute sensors (e.g., "Days Until Collection") can be tested:
    gw_attr_sensor = next(
        e
        for e in entities
        if isinstance(e, UKBinCollectionAttributeSensor)
        and "Days Until Collection" in e.name
    )
    # Trigger the state logic (which calls calculate_days_until)
    days_until = gw_attr_sensor.state
    # In our example, if today is 2025-02-08 and collection is today for General Waste,
    # days would be 0 (or, if we adjust coordinator.data, you can compare against the expected value).
    # For this test we simply check that a value is returned (or you can be more
    # specific if you set the dates).
    assert days_until is not None

    # Also, verify that a raw JSON sensor exists.
    raw_sensor = next(e for e in entities if isinstance(e, UKBinCollectionRawJSONSensor))
    # Its state should be a JSON string containing keys for each bin type.
    raw_state = json.loads(raw_sensor.state)
    assert "General Waste" in raw_state and "Recycling" in raw_state


def test_create_sensor_entities_invalid_icon_json():
    coordinator = MagicMock()
    coordinator.data = {
        "General Waste": datetime(2025, 2, 10).date(),
    }
    coordinator.name = "Test Coordinator"

    invalid_json = '{"invalid":true, '  # incomplete JSON

    with patch("logging.Logger.warning") as mock_warn:
        entities = create_sensor_entities(coordinator, "test_entry_id", invalid_json)

        # We still get 1 main sensor + 5 attribute sensors + 1 raw sensor => 7 total
        assert len(entities) == 7

        mock_warn.assert_called_once()
        # e.g. "... Invalid icon_color_mapping JSON: ... Using default settings."
@freeze_time("2025-02-08")  # let's say "today" is 2025-02-08
def test_attribute_sensor_days_and_human_readable():
    coordinator = MagicMock()
    # Pretend "Food Waste" is 2 days away
    in_2_days = datetime(2025, 2, 10).date()
    coordinator.data = {"Food Waste": in_2_days}
    coordinator.name = "Coordinator Name"

    # Create sensors for that bin
    entities = create_sensor_entities(coordinator, "entry_id_days", "{}")

    # Find the attribute sensors for "Days Until Collection" & "Next Collection Human Readable"
    days_sensor = next(
        e
        for e in entities
        if isinstance(e, UKBinCollectionAttributeSensor)
        and "Days Until Collection" in e.name
    )
    human_sensor = next(
        e
        for e in entities
        if isinstance(e, UKBinCollectionAttributeSensor)
        and "Next Collection Human Readable" in e.name
    )

    # The .state property triggers the logic
    days_state = days_sensor.state
    human_state = human_sensor.state

    # If today is e.g. 2025-02-08, in_2_days is 2025-02-10 => that's 2 days away
    # "Days Until Collection" => 2
    # "Next Collection Human Readable" => "In 2 days" (assuming 2 != 1 => "days")
    assert days_state == 2
    assert human_state == "In 2 days"


def test_data_sensor_coordinator_update():
    coordinator = MagicMock()
    coordinator.data = {"General Waste": datetime(2025, 2, 10).date()}
    coordinator.name = "Coordinator Name"

    sensor = UKBinCollectionDataSensor(coordinator, "General Waste", "device_id", {})

    with patch.object(sensor, "update_state") as mock_update, patch.object(
        sensor, "async_write_ha_state"
    ) as mock_write:
        sensor._handle_coordinator_update()
        mock_update.assert_called_once()
        mock_write.assert_called_once()


@freeze_time("2025-02-10")  # let's say "today" is 2025-02-10
def test_data_sensor_today_tomorrow():
    coordinator = MagicMock()
    # Make 2 bins: one is "today" (2025-02-10), one is "tomorrow" (2025-02-11)
    coordinator.data = {
        "Waste Today": datetime(2025, 2, 10).date(),
        "Waste Tomorrow": datetime(2025, 2, 11).date(),
    }
    coordinator.name = "Coord"

    # create sensors
    entities = create_sensor_entities(coordinator, "entry_id", "{}")

    tdy_sensor = next(
        e
        for e in entities
        if isinstance(e, UKBinCollectionDataSensor) and "Waste Today" in e.name
    )
    tmw_sensor = next(
        e
        for e in entities
        if isinstance(e, UKBinCollectionDataSensor) and "Waste Tomorrow" in e.name
    )
    assert tdy_sensor.state == "Today"
    assert tmw_sensor.state == "Tomorrow"


@freeze_time("2025-02-08")
def test_create_sensor_entities_full_coverage(hass):
    coordinator = MagicMock()
    coordinator.data = {
        "General Waste": datetime(2025, 2, 8).date(),  # Today
        "Recycling": datetime(2025, 2, 9).date(),  # Tomorrow
        "Garden": datetime(2025, 2, 10).date(),  # 2 days
    }
    coordinator.name = "Full Coverage Coord"

    # Intentionally pass an invalid JSON to load_icon_color_mapping
    invalid_icon_json = '{"General Waste": {"icon":"mdi:trash-can"}, "broken"'

    with patch("logging.Logger.warning") as mock_warn:
        entities = create_sensor_entities(
            coordinator, "entry_id_abc", invalid_icon_json
        )

    # We get main sensors for 3 bins => 3
    # Each bin has 5 attribute sensors => 15
    # 1 raw sensor => 1
    # total => 19
    assert len(entities) == 19

    # Check the warning for invalid JSON was called
    mock_warn.assert_called_once()
    # e.g. "Invalid icon_color_mapping JSON: ... Using default settings."

    # Now pick an attribute sensor for "Days Until Collection" for "Garden"
    days_garden = next(e for e in entities if "Garden Days Until Collection" in e.name)
    # Trigger the state property => calls `calculate_days_until`
    days_val = days_garden.state
    assert days_val == 2  # Because 2025-02-10 is 2 days from 2025-02-08

    # Similarly, the raw JSON sensor
    raw_sensor = next(e for e in entities if isinstance(e, UKBinCollectionRawJSONSensor))
    raw_state = raw_sensor.state  # Should be a JSON with 3 keys, etc.
    # This triggers lines in raw-sensor code

    # Also test `_handle_coordinator_update` on the main sensor
    main_sensor = next(
        e
        for e in entities
        if "General Waste" in e.name and isinstance(e, UKBinCollectionDataSensor)
    )
    with patch.object(main_sensor, "update_state") as mock_up, patch.object(
        main_sensor, "async_write_ha_state"
    ) as mock_aw:
        main_sensor._handle_coordinator_update()
        mock_up.assert_called_once()
        mock_aw.assert_called_once()


###############################################################################
# Tests for UKBinCollectionAttributeSensor's state and helper methods
###############################################################################


def test_attribute_sensor_state_colour():
    """Test that if attribute type is 'Colour', state returns _color."""
    coordinator = MagicMock()
    # Provide some bin data though it isn't used in this branch
    coordinator.data = {"Recycling": datetime(2025, 2, 10).date()}
    coordinator.name = "Test Coord"

    # Provide a mapping that supplies a color.
    icon_mapping = {"Recycling": {"icon": "mdi:recycle", "color": "green"}}

    sensor = UKBinCollectionAttributeSensor(
        coordinator, "Recycling", "uid1", "Colour", "dev1", icon_mapping
    )
    # The state for attribute "Colour" is simply the color.
    assert sensor.state == "green"


def test_attribute_sensor_state_bin_type():
    """Test that if attribute type is 'Bin Type', state returns the bin type."""
    coordinator = MagicMock()
    coordinator.data = {"Recycling": datetime(2025, 2, 10).date()}
    coordinator.name = "Test Coord"

    sensor = UKBinCollectionAttributeSensor(
        coordinator, "Recycling", "uid2", "Bin Type", "dev2", {}
    )
    assert sensor.state == "Recycling"


def test_attribute_sensor_state_next_collection_date_with_data():
    """Test that if attribute type is 'Next Collection Date' and data exists, state is the formatted date."""
    date_value = datetime(2025, 2, 10).date()
    coordinator = MagicMock()
    coordinator.data = {"Recycling": date_value}
    coordinator.name = "Test Coord"

    sensor = UKBinCollectionAttributeSensor(
        coordinator, "Recycling", "uid3", "Next Collection Date", "dev3", {}
    )
    expected = date_value.strftime("%d/%m/%Y")
    assert sensor.state == expected


def test_attribute_sensor_state_next_collection_date_no_data():
    """Test that if attribute type is 'Next Collection Date' and no data exists, state is 'Unknown'."""
    coordinator = MagicMock()
    coordinator.data = {}
    coordinator.name = "Test Coord"

    sensor = UKBinCollectionAttributeSensor(
        coordinator, "Recycling", "uid4", "Next Collection Date", "dev4", {}
    )
    assert sensor.state == "Unknown"


@freeze_time("2025-02-08")
def test_attribute_sensor_state_next_collection_human_readable_today():
    """Test human-readable state when bin collection is today."""
    coordinator = MagicMock()
    coordinator.data = {"Recycling": datetime(2025, 2, 8).date()}
    coordinator.name = "Test Coord"

    sensor = UKBinCollectionAttributeSensor(
        coordinator, "Recycling", "uid5", "Next Collection Human Readable", "dev5", {}
    )
    # When the collection date is today, expect "Today"
    assert sensor.state == "Today"


@freeze_time("2025-02-08")
def test_attribute_sensor_state_next_collection_human_readable_tomorrow():
    """Test human-readable state when bin collection is tomorrow."""
    coordinator = MagicMock()
    coordinator.data = {"Recycling": datetime(2025, 2, 9).date()}
    coordinator.name = "Test Coord"

    sensor = UKBinCollectionAttributeSensor(
        coordinator, "Recycling", "uid6", "Next Collection Human Readable", "dev6", {}
    )
    assert sensor.state == "Tomorrow"


@freeze_time("2025-02-08")
def test_attribute_sensor_state_next_collection_human_readable_future():
    """Test human-readable state when bin collection is more than one day away."""
    coordinator = MagicMock()
    coordinator.data = {"Recycling": datetime(2025, 2, 12).date()}  # 4 days later
    coordinator.name = "Test Coord"

    sensor = UKBinCollectionAttributeSensor(
        coordinator, "Recycling", "uid7", "Next Collection Human Readable", "dev7", {}
    )
    # 2025-02-12 is 4 days away from 2025-02-08
    assert sensor.state == "In 4 days"


@freeze_time("2025-02-08")
def test_attribute_sensor_state_days_until_collection_with_data():
    """Test that Days Until Collection returns the correct number of days."""
    coordinator = MagicMock()
    coordinator.data = {"Recycling": datetime(2025, 2, 11).date()}  # 3 days away
    coordinator.name = "Test Coord"

    sensor = UKBinCollectionAttributeSensor(
        coordinator, "Recycling", "uid8", "Days Until Collection", "dev8", {}
    )
    assert sensor.state == 3


@freeze_time("2025-02-08")
def test_attribute_sensor_state_days_until_collection_no_data():
    """Test that Days Until Collection returns -1 if no data is available."""
    coordinator = MagicMock()
    coordinator.data = {}
    coordinator.name = "Test Coord"

    sensor = UKBinCollectionAttributeSensor(
        coordinator, "Recycling", "uid9", "Days Until Collection", "dev9", {}
    )
    assert sensor.state == -1


###############################################################################
# Tests for extra_state_attributes, device_info, and unique_id properties
###############################################################################


def test_data_sensor_extra_state_attributes():
    """Test that extra_state_attributes returns the correct dictionary."""
    coordinator = MagicMock()
    date_value = datetime(2025, 2, 10).date()
    coordinator.data = {"Recycling": date_value}
    coordinator.name = "Test Coord"

    sensor = UKBinCollectionDataSensor(coordinator, "Recycling", "uid10", {})

    expected_attributes = {
        STATE_ATTR_COLOUR: sensor.get_color(),  # without mapping, default is "black"
        STATE_ATTR_NEXT_COLLECTION: date_value.strftime("%d/%m/%Y"),
        STATE_ATTR_DAYS: (date_value - dt_util.now().date()).days,
    }
    assert sensor.extra_state_attributes == expected_attributes


def test_data_sensor_device_info_property():
    """Test that the device_info property returns the expected dictionary."""
    coordinator = MagicMock()
    coordinator.name = "Test Name"

    sensor = UKBinCollectionDataSensor(coordinator, "General Waste", "device123", {})
    expected = {
        "identifiers": {(DOMAIN, "device123")},
        "name": f"{coordinator.name} General Waste",
        "manufacturer": "UK Bin Collection",
        "model": "Bin Sensor",
        "sw_version": "1.0",
    }
    assert sensor.device_info == expected


def test_data_sensor_unique_id_property():
    """Test that unique_id property returns the correct value."""
    coordinator = MagicMock()
    sensor = UKBinCollectionDataSensor(
        coordinator, "General Waste", "unique_id_123", {}
    )
    assert sensor.unique_id == "unique_id_123"


###############################################################################
# Tests for create_sensor_entities() helper function
###############################################################################


def test_create_sensor_entities_coordinator_data():
    """Test that create_sensor_entities returns the correct sensor entities."""
    coordinator = MagicMock()
    # Suppose we have two bin types.
    coordinator.data = {
        "General Waste": date(2025, 2, 8),  # Today
        "Recycling": date(2025, 2, 9),  # Tomorrow
    }
    coordinator.name = "Test Coordinator"

    # Use a valid JSON mapping for General Waste only.
    icon_mapping_json = '{"General Waste":{"icon":"mdi:trash-can","color":"brown"}}'

    entities = create_sensor_entities(coordinator, "test_entry", icon_mapping_json)

    # Expect:
    #   2 main sensors (one for each bin type)
    #   2 * 5 = 10 attribute sensors (5 per bin type)
    #   1 raw JSON sensor
    #   Total = 2 + 10 + 1 = 13
    assert len(entities) == 13

    # Verify that the General Waste sensor uses the icon mapping.
    gw_sensor = next(
        e
        for e in entities
        if isinstance(e, UKBinCollectionDataSensor) and "General Waste" in e.name
    )
    assert gw_sensor.icon == "mdi:trash-can"

    # Check that one of the attribute sensors exists (e.g. Days Until Collection)
    gw_attr = next(
        e
        for e in entities
        if isinstance(e, UKBinCollectionAttributeSensor)
        and "Days Until Collection" in e.name
    )
    assert gw_attr is not None

    # Verify that a raw JSON sensor is present.
    raw_sensor = next(e for e in entities if isinstance(e, UKBinCollectionRawJSONSensor))
    raw_state = json.loads(raw_sensor.state)
    assert "General Waste" in raw_state and "Recycling" in raw_state


def test_create_sensor_entities_invalid_icon_json():
    """Test that create_sensor_entities logs a warning when icon_color_mapping is invalid."""
    coordinator = MagicMock()
    coordinator.data = {
        "General Waste": datetime(2025, 2, 10).date(),
    }
    coordinator.name = "Test Coordinator"

    invalid_json = '{"invalid":true, '  # Incomplete JSON

    with patch("logging.Logger.warning") as mock_warn:
        entities = create_sensor_entities(coordinator, "test_entry_id", invalid_json)

        # With one bin type we expect: 1 main sensor + 5 attribute sensors + 1 raw sensor = 7 total
        assert len(entities) == 7
        mock_warn.assert_called_once()


###############################################################################
# Additional tests for the attribute sensor calculation methods
###############################################################################


@freeze_time("2025-02-08")
def test_attribute_sensor_days_and_human_readable():
    """Test that the attribute sensor returns correct human-readable and days until collection."""
    coordinator = MagicMock()
    # Suppose "Food Waste" is 2 days away from 2025-02-08
    in_2_days = datetime(2025, 2, 10).date()
    coordinator.data = {"Food Waste": in_2_days}
    coordinator.name = "Coordinator Name"

    sensor = UKBinCollectionAttributeSensor(
        coordinator,
        "Food Waste",
        "uid_full",
        "Next Collection Human Readable",
        "dev_full",
        {},
    )

    # When there is data, the human-readable text should be "In 2 days"
    assert sensor.calculate_human_readable() == "In 2 days"
    # And calculate_days_until should return 2
    assert sensor.calculate_days_until() == 2


###############################################################################
# Tests for the Raw JSON Sensor behavior
###############################################################################


def test_raw_json_sensor_partial_data():
    """Test that the raw JSON sensor correctly encodes None values."""
    coordinator = MagicMock()
    # Only some bins have dates; for example, "General Waste" is None.
    coordinator.data = {"General Waste": None, "Recycling": datetime(2025, 1, 1).date()}
    coordinator.last_update_success = True

    sensor = UKBinCollectionRawJSONSensor(coordinator, "raw_uid", "Test Name")
    state = sensor.state
    # Expect that the None value is encoded as null in JSON.
    expected = '{"General Waste": null, "Recycling": "01/01/2025"}'
    assert state == expected


def test_data_sensor_unavailable_if_unknown_state():
    """Test that the sensor is marked unavailable when its state is 'Unknown'."""
    coordinator = MagicMock()
    coordinator.data = {}  # no bin data provided
    coordinator.name = "Test Coordinator"

    sensor = UKBinCollectionDataSensor(coordinator, "General Waste", "uid_unavail", {})
    sensor.update_state()  # This should set state to "Unknown"
    assert sensor.available is False


def test_attribute_sensor_unavailable_if_coordinator_failed():
    """Test that an attribute sensor is unavailable if the coordinator update failed."""
    coordinator = MagicMock()
    coordinator.data = {"Recycling": datetime(2025, 1, 1).date()}
    coordinator.last_update_success = False
    coordinator.name = "Test Coordinator"

    sensor = UKBinCollectionAttributeSensor(
        coordinator, "Recycling", "uid_fail", "Colour", "dev_fail", {}
    )
    assert sensor.available is False


# --- Additional tests for uncovered lines ---


def test_data_sensor_state_unknown_and_extra_attributes():
    """Test that if no bin data is provided the state is 'Unknown' and extra_state_attributes are set correctly."""
    # Create a coordinator with no data for the requested bin type.
    coordinator = MagicMock()
    coordinator.data = {}  # No data available.
    coordinator.name = "Test Coord"

    # Create a data sensor for a bin type that is not in the coordinator data.
    sensor = UKBinCollectionDataSensor(
        coordinator, "Nonexistent Bin", "device_unknown", {}
    )
    sensor.update_state()  # This should set the state to "Unknown"

    # Verify the state fallback
    assert sensor.state == "Unknown"

    # Verify extra_state_attributes returns default values:
    # The colour is determined by get_color()—with no mapping it returns "black"
    extra = sensor.extra_state_attributes
    assert extra[STATE_ATTR_COLOUR] == "black"
    # Since there is no bin date, the next collection attribute should be None.
    assert extra[STATE_ATTR_NEXT_COLLECTION] is None


def test_data_sensor_device_info_and_unique_id():
    """Test that the device_info and unique_id properties return the expected values."""
    coordinator = MagicMock()
    coordinator.name = "Test Coord"

    # Create a sensor with a given device ID.
    sensor = UKBinCollectionDataSensor(
        coordinator, "General Waste", "unique_id_test", {}
    )

    expected_device_info = {
        "identifiers": {(DOMAIN, "unique_id_test")},
        "name": f"{coordinator.name} General Waste",
        "manufacturer": "UK Bin Collection",
        "model": "Bin Sensor",
        "sw_version": "1.0",
    }
    # Test device_info property
    assert sensor.device_info == expected_device_info
    # Test unique_id property
    assert sensor.unique_id == "unique_id_test"


# --- Additional tests for the Attribute Sensor helper methods ---


@freeze_time("2025-02-08")
def test_attribute_sensor_calculate_human_readable_and_days_until():
    """Test the calculate_human_readable and calculate_days_until methods of the attribute sensor."""
    # Suppose "Food Waste" is 3 days away from 2025-02-08.
    future_date = datetime(2025, 2, 11).date()
    coordinator = MagicMock()
    coordinator.data = {"Food Waste": future_date}
    coordinator.name = "Test Coord"

    sensor = UKBinCollectionAttributeSensor(
        coordinator,
        "Food Waste",
        "attr_uid",
        "Next Collection Human Readable",
        "dev_uid",
        {},
    )

    # Manually call the helper methods:
    human_readable = sensor.calculate_human_readable()
    days_until = sensor.calculate_days_until()

    # From 2025-02-08 to 2025-02-11 is 3 days away.
    assert human_readable == "In 3 days"
    assert days_until == 3


# --- Additional tests for create_sensor_entities helper function ---


def test_create_sensor_entities_with_no_data():
    """Test create_sensor_entities returns sensors even when coordinator data is empty."""
    coordinator = MagicMock()
    coordinator.data = {}  # No bin types at all.
    coordinator.name = "Empty Coord"

    # Pass an empty JSON mapping.
entities = create_sensor_entities(coordinator, "empty_entry", "{}") # We expect only the Raw JSON sensor to be created. # (since the for-loop over coordinator.data will not iterate if data is empty) assert len(entities) == 1 assert isinstance(entities[0], UKBinCollectionRawJSONSensor) # --- Additional tests for load_icon_color_mapping default return --- def test_load_icon_color_mapping_empty_string(): """Test that load_icon_color_mapping returns an empty dict if an empty string is provided.""" result = load_icon_color_mapping("") assert result == {} # (The test for invalid JSON is already present; see test_load_icon_color_mapping_invalid_json) # --- Additional test for UKBinCollectionRawJSONSensor property behavior --- def test_raw_json_sensor_with_no_data(): """Test that the Raw JSON sensor returns '{}' when no coordinator data is available.""" coordinator = MagicMock() coordinator.data = {} coordinator.last_update_success = True sensor = UKBinCollectionRawJSONSensor(coordinator, "raw_test", "Test Name") assert sensor.state == "{}" # The extra_state_attributes should return an empty dict under the key "raw_data" assert sensor.extra_state_attributes == {"raw_data": {}} # Availability should depend on last_update_success assert sensor.available is True ================================================ FILE: custom_components/uk_bin_collection/translations/cy.json ================================================ { "title": "Data Casglu Biniau y DU", "config": { "step": { "user": { "title": "Dewiswch y cyngor", "data": { "name": "Enw'r lleoliad", "council": "Cyngor", "manual_refresh_only": "Adnewyddu'r synhwyrydd yn awtomatig", "icon_color_mapping": "JSON i fapio Math y Bin ar gyfer Lliw ac Eicon gweler: https://github.com/robbrad/UKBinCollectionData" }, "description": "Gweler [yma](https://github.com/robbrad/UKBinCollectionData#requesting-your-council) os nad yw eich cyngor wedi'i restru" }, "council": { "title": "Darparu manylion y cyngor", "data": { "url": "URL i 
nôl data casglu biniau", "timeout": "Yr amser mewn eiliadau y dylai'r synhwyrydd aros am ddata", "update_interval": "Amser mewn oriau rhwng diweddariadau", "uprn": "UPRN (Rhif Cyfeirnod Eiddo Unigryw)", "postcode": "Cod post y cyfeiriad", "number": "Rhif tŷ y cyfeiriad", "usrn": "USRN (Rhif Cyfeirnod Stryd Unigryw)", "web_driver": "I redeg ar weinydd Selenium o bell, ychwanegwch URL y Weinydd Selenium", "headless": "Rhedeg Selenium yn y modd heb ben (argymhellir)", "local_browser": "Peidiwch â rhedeg ar weinydd Selenium o bell, defnyddiwch osodiad lleol o Chrome yn lle", "submit": "Cyflwyno" }, "description": "Cyfeiriwch at gofnod [wiki](https://github.com/robbrad/UKBinCollectionData/wiki/Councils) eich cyngor am fanylion ar beth i'w nodi.\n{selenium_message}" }, "reconfigure_confirm": { "title": "Diweddaru manylion y cyngor", "data": { "url": "URL i nôl data casglu biniau", "timeout": "Yr amser mewn eiliadau y dylai'r synhwyrydd aros am ddata", "update_interval": "Amser mewn oriau rhwng diweddariadau", "uprn": "UPRN (Rhif Cyfeirnod Eiddo Unigryw)", "postcode": "Cod post y cyfeiriad", "number": "Rhif tŷ y cyfeiriad", "usrn": "USRN (Rhif Cyfeirnod Stryd Unigryw)", "web_driver": "I redeg ar weinydd Selenium o bell, ychwanegwch URL y Weinydd Selenium", "headless": "Rhedeg Selenium yn y modd heb ben (argymhellir)", "manual_refresh_only": "Adnewyddu'r synhwyrydd yn awtomatig", "local_browser": "Peidiwch â rhedeg ar weinydd Selenium o bell, defnyddiwch osodiad lleol o Chrome yn lle", "icon_color_mapping": "JSON i fapio Math y Bin ar gyfer Lliw ac Eicon gweler: https://github.com/robbrad/UKBinCollectionData", "submit": "Cyflwyno" }, "description": "Cyfeiriwch at gofnod [wiki](https://github.com/robbrad/UKBinCollectionData/wiki/Councils) eich cyngor am fanylion ar beth i'w nodi." } }, "error": { "name": "Rhowch enw lleoliad os gwelwch yn dda", "council": "Dewiswch gyngor os gwelwch yn dda", "selenium_unavailable": "❌ Nid yw gweinydd Selenium ar gael. 
Sicrhewch ei fod yn rhedeg yn http://localhost:4444 neu http://selenium:4444. [Canllaw Gosod](https://example.com/selenium-setup)", "chromium_not_found": "❌ Nid yw porwr Chromium wedi'i osod. Gosodwch Chromium neu Google Chrome os gwelwch yn dda. [Canllaw Gosod](https://example.com/chromium-install)" } } } ================================================ FILE: custom_components/uk_bin_collection/translations/en.json ================================================ { "title": "UK Bin Collection Data", "config": { "step": { "user": { "title": "Select the council", "data": { "name": "Location name", "council": "Council", "manual_refresh_only":"Automatically refresh the sensor", "icon_color_mapping": "JSON to map Bin Type for Colour and Icon (see documentation)" }, "description": "Please see the documentation if your council isn't listed" }, "council": { "title": "Provide council details", "data": { "url": "URL to fetch bin collection data", "timeout": "The time in seconds for how long the sensor should wait for data", "update_interval": "Time in hours between updates", "uprn": "UPRN (Unique Property Reference Number)", "postcode": "Postcode of the address", "number": "House number of the address", "usrn": "USRN (Unique Street Reference Number)", "web_driver": "To run on a remote Selenium Server add the Selenium Server URL", "headless": "Run Selenium in headless mode (recommended)", "local_browser": "Don't run on remote Selenium server, use local install of Chrome instead", "submit": "Submit" }, "description": "Please refer to your council's wiki entry for details on what to enter.\n{selenium_message}" }, "reconfigure_confirm": { "title": "Update council details", "data": { "url": "URL to fetch bin collection data", "timeout": "The time in seconds for how long the sensor should wait for data", "update_interval": "Time in hours between updates", "uprn": "UPRN (Unique Property Reference Number)", "postcode": "Postcode of the address", "number": "House number of the 
address", "usrn": "USRN (Unique Street Reference Number)", "web_driver": "To run on a remote Selenium Server add the Selenium Server URL", "headless": "Run Selenium in headless mode (recommended)", "local_browser": "Don't run on remote Selenium server, use local install of Chrome instead", "manual_refresh_only": "Automatically refresh the sensor", "icon_color_mapping": "JSON to map Bin Type for Colour and Icon (see documentation)", "submit": "Submit" }, "description": "Please refer to your council's wiki entry for details on what to enter." } }, "error": { "name": "Please enter a location name", "council": "Please select a council", "selenium_unavailable": "Selenium server is not accessible. Please ensure it is running at localhost:4444 or selenium:4444", "chromium_not_found": "Chromium browser is not installed. Please install Chromium or Google Chrome" } } } ================================================ FILE: custom_components/uk_bin_collection/translations/ga.json ================================================ { "title": "Sonraí Bailithe Binn RA", "config": { "step": { "user": { "title": "Roghnaigh an chomhairle", "data": { "name": "Ainm Suíomh", "council": "Comhairle", "manual_refresh_only": "Athnuaigh an braiteoir go huathoibríoch", "icon_color_mapping": "JSON chun Cineál Bin a mhapáil do Dath agus Deilbhín féach: https://github.com/robbrad/UKBinCollectionData" }, "description": "Féach [anseo](https://github.com/robbrad/UKBinCollectionData#requesting-your-council) mura bhfuil do chomhairle liostaithe" }, "council": { "title": "Sonraí na comhairle a sholáthar", "data": { "url": "URL chun sonraí bailithe bin a fháil", "timeout": "An t-am i soicindí don braiteoir fanacht le haghaidh sonraí", "update_interval": "Am i n-uaireanta idir nuashonruithe", "uprn": "UPRN (Uimhir Tagartha Aonair Maoine)", "postcode": "Cód poist an seoladh", "number": "Uimhir tí an seoladh", "usrn": "USRN (Uimhir Tagartha Sráide Uathúil)", "web_driver": "Chun rith ar Fhreastalaí
Iargúlta Selenium cuir isteach URL an Fhreastalaí Selenium", "headless": "Rith Selenium i mód gan cheann (molta)", "local_browser": "Ná rith ar fhreastalaí iargúlta Selenium, úsáid suiteáil áitiúil de Chrome ina ionad", "submit": "Cuir isteach" }, "description": "Féach ar iontráil [wiki](https://github.com/robbrad/UKBinCollectionData/wiki/Councils) do chomhairle le haghaidh sonraí ar cad atá le cur isteach.\n{selenium_message}" }, "reconfigure_confirm": { "title": "Nuashonraigh sonraí na comhairle", "data": { "url": "URL chun sonraí bailithe bin a fháil", "timeout": "An t-am i soicindí don braiteoir fanacht le haghaidh sonraí", "update_interval": "Am i n-uaireanta idir nuashonruithe", "uprn": "UPRN (Uimhir Tagartha Aonair Maoine)", "postcode": "Cód poist an seoladh", "number": "Uimhir tí an seoladh", "usrn": "USRN (Uimhir Tagartha Sráide Uathúil)", "web_driver": "Chun rith ar Fhreastalaí Iargúlta Selenium cuir isteach URL an Fhreastalaí Selenium", "headless": "Rith Selenium i mód gan cheann (molta)", "local_browser": "Ná rith ar fhreastalaí iargúlta Selenium, úsáid suiteáil áitiúil de Chrome ina ionad", "manual_refresh_only": "Athnuaigh an braiteoir go huathoibríoch", "icon_color_mapping": "JSON chun Cineál Bin a mhapáil do Dath agus Deilbhín féach: https://github.com/robbrad/UKBinCollectionData", "submit": "Cuir isteach" }, "description": "Féach ar iontráil [wiki](https://github.com/robbrad/UKBinCollectionData/wiki/Councils) do chomhairle le haghaidh sonraí ar cad atá le cur isteach." } }, "error": { "name": "Cuir isteach ainm suíomh le do thoil", "council": "Roghnaigh comhairle le do thoil", "selenium_unavailable": "❌ Níl freastalaí Selenium inrochtana. Cinntigh go bhfuil sé ag rith ag http://localhost:4444 nó http://selenium:4444. [Treoir Socraithe](https://example.com/selenium-setup)", "chromium_not_found": "❌ Níl brabhsálaí Chromium suiteáilte. Suiteáil Chromium nó Google Chrome le do thoil.
[Treoir Suiteála](https://example.com/chromium-install)" } } } ================================================ FILE: custom_components/uk_bin_collection/translations/gd.json ================================================ { "title": "Dàta Cruinneachadh Biona RA", "config": { "step": { "user": { "title": "Tagh a’ chomhairle", "data": { "name": "Ainm Àite", "council": "Comhairle", "manual_refresh_only": "Ùraich an sensor gu fèin-obrachail", "local_browser": "Na ruith air frithealaiche Selenium iomallach, cleachd stàladh ionadail de Chrome an àite", "icon_color_mapping": "JSON gus Seòrsa Biona a mhapadh airson Dath agus Ìomhaigh faic: https://github.com/robbrad/UKBinCollectionData" }, "description": "Feuch an toir thu sùil [an seo](https://github.com/robbrad/UKBinCollectionData#requesting-your-council) mura h-eil do chomhairle air a liostadh" }, "council": { "title": "Thoir seachad mion-fhiosrachadh na comhairle", "data": { "url": "URL gus dàta cruinneachadh biona fhaighinn", "timeout": "An ùine ann an diogan airson cho fada bu chòir don sensor feitheamh airson dàta", "update_interval": "Ùine ann an uairean eadar ùrachaidhean", "uprn": "UPRN (Àireamh Iomraidh Seilbh Aonraic)", "postcode": "Còd-puist an t-seòladh", "number": "Àireamh an taighe den t-seòladh", "usrn": "USRN (Àireamh Iomraidh Sràid Aonraic)", "web_driver": "Gus ruith air Frithealaiche Selenium iomallach cuir a-steach an URL Frithealaiche Selenium", "headless": "Ruith Selenium ann am modh gun cheann (air a mholadh)", "local_browser": "Na ruith air frithealaiche Selenium iomallach, cleachd stàladh ionadail de Chrome an àite", "submit": "Cuir a-steach" }, "description": "Feuch an toir thu sùil air inntrigeadh [wiki](https://github.com/robbrad/UKBinCollectionData/wiki/Councils) na comhairle agad airson mion-fhiosrachadh air dè a chur a-steach.\n{selenium_message}" }, "reconfigure_confirm": { "title": "Ùraich mion-fhiosrachadh na comhairle", "data": { "url": "URL gus dàta cruinneachadh biona fhaighinn",
"timeout": "An ùine ann an diogan airson cho fada bu chòir don sensor feitheamh airson dàta", "update_interval": "Ùine ann an uairean eadar ùrachaidhean", "uprn": "UPRN (Àireamh Iomraidh Seilbh Aonraic)", "postcode": "Còd-puist an t-seòladh", "number": "Àireamh an taighe den t-seòladh", "usrn": "USRN (Àireamh Iomraidh Sràid Aonraic)", "web_driver": "Gus ruith air Frithealaiche Selenium iomallach cuir a-steach an URL Frithealaiche Selenium", "headless": "Ruith Selenium ann am modh gun cheann (air a mholadh)", "manual_refresh_only": "Ùraich an sensor gu fèin-obrachail", "local_browser": "Na ruith air frithealaiche Selenium iomallach, cleachd stàladh ionadail de Chrome an àite", "icon_color_mapping": "JSON gus Seòrsa Biona a mhapadh airson Dath agus Ìomhaigh faic: https://github.com/robbrad/UKBinCollectionData", "submit": "Cuir a-steach" }, "description": "Feuch an toir thu sùil air inntrigeadh [wiki](https://github.com/robbrad/UKBinCollectionData/wiki/Councils) na comhairle agad airson mion-fhiosrachadh air dè a chur a-steach." } }, "error": { "name": "Cuir a-steach ainm àite mas e do thoil e", "council": "Tagh comhairle mas e do thoil e", "selenium_unavailable": "❌ Chan eil frithealaiche Selenium ruigsinneach. Dèan cinnteach gu bheil e a’ ruith aig http://localhost:4444 no http://selenium:4444. [Stiùireadh Stèidheachaidh](https://example.com/selenium-setup)", "chromium_not_found": "❌ Chan eil brabhsair Chromium air a chuir a-steach. Stàlaich Chromium no Google Chrome mas e do thoil e.
[Stiùireadh Stàlaidh](https://example.com/chromium-install)" } } } ================================================ FILE: custom_components/uk_bin_collection/translations/pt.json ================================================ { "title": "Dados de Coleta de Lixo do Reino Unido", "config": { "step": { "user": { "title": "Selecione o conselho", "data": { "name": "Nome da localização", "council": "Conselho", "manual_refresh_only": "Atualize o sensor automaticamente", "icon_color_mapping": "JSON para mapear Tipo de Lixo para Cor e Ícone veja: https://github.com/robbrad/UKBinCollectionData" }, "description": "Por favor, veja [aqui](https://github.com/robbrad/UKBinCollectionData#requesting-your-council) se o seu conselho não estiver listado" }, "council": { "title": "Forneça os detalhes do conselho", "data": { "url": "URL para buscar dados de coleta de lixo", "timeout": "O tempo em segundos que o sensor deve esperar por dados", "update_interval": "Tempo em horas entre as atualizações", "uprn": "UPRN (Número de Referência Único da Propriedade)", "postcode": "Código postal do endereço", "number": "Número da casa do endereço", "usrn": "USRN (Número de Referência Único da Rua)", "web_driver": "Para executar em um Servidor Selenium remoto, adicione o URL do Servidor Selenium", "headless": "Execute o Selenium no modo sem cabeça (recomendado)", "local_browser": "Não execute no servidor Selenium remoto, use a instalação local do Chrome em vez disso", "submit": "Enviar" }, "description": "Por favor, consulte a entrada [wiki](https://github.com/robbrad/UKBinCollectionData/wiki/Councils) do seu conselho para detalhes sobre o que inserir.\n{selenium_message}" }, "reconfigure_confirm": { "title": "Atualizar detalhes do conselho", "data": { "url": "URL para buscar dados de coleta de lixo", "timeout": "O tempo em segundos que o sensor deve esperar por dados", "update_interval": "Tempo em horas entre as atualizações", "uprn": "UPRN (Número de Referência Único da Propriedade)", 
"postcode": "Código postal do endereço", "number": "Número da casa do endereço", "usrn": "USRN (Número de Referência Único da Rua)", "web_driver": "Para executar em um Servidor Selenium remoto, adicione o URL do Servidor Selenium", "headless": "Execute o Selenium no modo sem cabeça (recomendado)", "manual_refresh_only": "Atualize o sensor automaticamente", "local_browser": "Não execute no servidor Selenium remoto, use a instalação local do Chrome em vez disso", "icon_color_mapping": "JSON para mapear Tipo de Lixo para Cor e Ícone veja: https://github.com/robbrad/UKBinCollectionData", "submit": "Enviar" }, "description": "Por favor, consulte a entrada [wiki](https://github.com/robbrad/UKBinCollectionData/wiki/Councils) do seu conselho para detalhes sobre o que inserir." } }, "error": { "name": "Por favor, insira um nome de localização", "council": "Por favor, selecione um conselho", "selenium_unavailable": "❌ O servidor Selenium não está acessível. Por favor, certifique-se de que está em execução em http://localhost:4444 ou http://selenium:4444. [Guia de Configuração](https://example.com/selenium-setup)", "chromium_not_found": "❌ O navegador Chromium não está instalado. Por favor, instale o Chromium ou o Google Chrome. [Guia de Instalação](https://example.com/chromium-install)" } } } ================================================ FILE: docs/RELEASE-SETUP-SUMMARY.md ================================================ # Release Workflow Setup Summary ## What You Need to Do Your release workflow has been updated to use a GitHub App for secure, automated releases. Here's what you need to do to get it working: ### 1. Create GitHub App (5 minutes) Follow the detailed guide: [GitHub App Setup Guide](./github-app-setup.md) **Quick steps:** 1. Go to https://github.com/settings/apps/new 2. 
Fill in: - Name: `UKBinCollection Release Bot` (or similar unique name) - Homepage: `https://github.com/robbrad/UKBinCollectionData` - Uncheck "Webhook Active" - Permissions: **Contents** = Read and write 3. Click "Create GitHub App" 4. Click "Install App" → Select your repository 5. Generate a private key (downloads a `.pem` file) ### 2. Add Secrets to Repository (2 minutes) Go to: https://github.com/robbrad/UKBinCollectionData/settings/secrets/actions Add two secrets: **APP_ID** - Value: The App ID shown at top of app settings (e.g., `123456`) **APP_PRIVATE_KEY** - Value: Entire contents of the `.pem` file - Include the `-----BEGIN RSA PRIVATE KEY-----` and `-----END RSA PRIVATE KEY-----` lines ### 3. Test It (5 minutes) 1. Create a test branch: ```bash git checkout -b test/release-workflow ``` 2. Make a small change with a conventional commit: ```bash echo "# Test" >> README.md git add README.md git commit -m "fix: test release workflow" git push origin test/release-workflow ``` 3. Create and merge a PR on GitHub 4. Watch the workflows run: - Bump workflow should run and create a tag - Release workflow should publish to PyPI 5. Verify: - Check https://github.com/robbrad/UKBinCollectionData/releases - Check https://pypi.org/project/uk-bin-collection/ ## What Changed ### Workflows Updated - ✅ `.github/workflows/bump.yml` - Now uses GitHub App token - ✅ `.github/workflows/release.yml` - Cleaned up - ✅ `.github/workflows/validate-release-ready.yml` - Simplified ### Documentation Updated - ✅ `docs/release-workflow.md` - Main documentation - ✅ `docs/release-workflow-setup-checklist.md` - Setup checklist - ✅ `docs/release-quick-reference.md` - Quick reference - ✅ `docs/github-app-setup.md` - **NEW** - Detailed GitHub App guide - ✅ `docs/release-workflow-migration.md` - Migration guide ### Configuration Updated - ✅ `pyproject.toml` - Enhanced Commitizen config ## How It Works Now ``` 1. Developer creates PR with conventional commits (feat:, fix:, etc.) 2. 
CI validates commits and runs tests 3. PR gets merged to master 4. Bump workflow automatically: - Analyzes commits with Commitizen - Updates version in all files - Updates CHANGELOG.md - Creates commit and tag - Pushes to master (using GitHub App to bypass protection) 5. Release workflow automatically: - Builds package - Publishes to PyPI - Creates GitHub release ``` **Everything is automated after merge!** ## Benefits ✅ **More secure** - GitHub App instead of personal token ✅ **No expiration** - Tokens auto-refresh ✅ **Fully automated** - No manual steps after PR merge ✅ **Version syncing** - Commitizen handles all version files ✅ **Better changelog** - Auto-generated from commits ✅ **Simpler** - Fewer secrets to manage ## Troubleshooting ### Bump workflow fails with "Bad credentials" - Check that `APP_PRIVATE_KEY` includes the full key with BEGIN/END lines - Verify no extra spaces or line breaks were added ### Bump workflow fails with "Resource not accessible" - Verify the GitHub App has "Contents: Read and write" permission - Check that the app is installed on the repository ### Release didn't trigger - Check if the tag was created (look at bump workflow logs) - Verify the tag follows the format `X.Y.Z` (no `v` prefix) ## Need Help? - 📖 [Full Documentation](./release-workflow.md) - 🔧 [GitHub App Setup Guide](./github-app-setup.md) - ✅ [Setup Checklist](./release-workflow-setup-checklist.md) - ⚡ [Quick Reference](./release-quick-reference.md) ## Next Steps 1. ✅ Complete the GitHub App setup 2. ✅ Add the secrets to your repository 3. ✅ Test with a small PR 4. ✅ Monitor the first few releases 5. ✅ Remove old `PERSONAL_ACCESS_TOKEN` if you had one 6. ✅ Update team documentation That's it! Your release workflow is now simplified and more secure. 
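## Appendix: How the Bump Is Decided

The commit-to-version mapping described above can be sketched in Python. This is a simplified, hypothetical illustration, not Commitizen's actual implementation (its rules are configurable in `pyproject.toml`), but it captures the mapping: `feat:` triggers a minor bump, `fix:` a patch, and `!` or `BREAKING CHANGE` a major; other types trigger no release.

```python
import re

# Hypothetical sketch of the bump decision; Commitizen's real rules are
# configurable and more nuanced, but the mapping is the same in spirit.
def bump_type(commit_messages):
    """Return 'major', 'minor', 'patch', or None for a list of commit messages."""
    order = {"patch": 1, "minor": 2, "major": 3}
    bump = None
    for msg in commit_messages:
        header = msg.splitlines()[0]
        if "BREAKING CHANGE" in msg or re.match(r"^\w+(\([^)]*\))?!:", header):
            level = "major"  # e.g. "feat!: change API format"
        elif re.match(r"^feat(\([^)]*\))?:", header):
            level = "minor"  # e.g. "feat(councils): add Leeds support"
        elif re.match(r"^fix(\([^)]*\))?:", header):
            level = "patch"  # e.g. "fix(selenium): handle timeout"
        else:
            continue  # docs:, chore:, style:, etc. do not trigger a release
        if bump is None or order[level] > order[bump]:
            bump = level
    return bump
```

For example, a merge containing both `fix(selenium): handle timeout` and `feat(councils): add Leeds support` produces a single minor bump, because the highest-ranked commit type wins.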
================================================ FILE: docs/deploy-key-setup.md ================================================ # Deploy Key Setup Guide Since the GitHub App bypass feature isn't available on your plan, we'll use a deploy key instead. This works on all GitHub plans and can bypass branch protection. ## Step 1: Generate SSH Key Pair On your local machine (Windows), open PowerShell and run: ```powershell ssh-keygen -t ed25519 -C "github-actions-deploy-key" -f ukbcd-deploy-key -N "" ``` This creates two files: - `ukbcd-deploy-key` (private key) - `ukbcd-deploy-key.pub` (public key) ## Step 2: Add Public Key to Repository 1. Go to: https://github.com/robbrad/UKBinCollectionData/settings/keys 2. Click **"Add deploy key"** 3. Fill in: - **Title**: `Release Workflow Deploy Key` - **Key**: Paste the contents of `ukbcd-deploy-key.pub` - **✅ Allow write access** - IMPORTANT: Check this box! 4. Click **"Add key"** ## Step 3: Add Private Key as Secret 1. Go to: https://github.com/robbrad/UKBinCollectionData/settings/secrets/actions 2. Click **"New repository secret"** 3. Fill in: - **Name**: `DEPLOY_KEY` - **Value**: Paste the entire contents of `ukbcd-deploy-key` (the private key, not .pub) 4. Click **"Add secret"** ## Step 4: Update Branch Protection Deploy keys with write access can bypass branch protection automatically, but you need to ensure: 1. Go to: https://github.com/robbrad/UKBinCollectionData/settings/branch_protection_rules 2. Edit your master branch rule 3. Make sure **"Do not allow bypassing the above settings"** is UNCHECKED - Or if you see **"Include administrators"**, UNCHECK it 4. Save changes ## Step 5: Clean Up After adding the keys to GitHub, delete the local key files: ```powershell Remove-Item ukbcd-deploy-key Remove-Item ukbcd-deploy-key.pub ``` ## Step 6: Remove Old Secrets (Optional) Since we're not using the GitHub App anymore, you can remove: - `APP_ID` - `APP_PRIVATE_KEY` Or keep them for future use. ## Test It 1. 
Create a test PR with a conventional commit 2. Merge it 3. Watch the bump workflow run 4. It should now successfully push to master ## How It Works - Deploy keys with write access can push to protected branches - The SSH key authenticates the workflow - No need for GitHub App or PAT - Works on all GitHub plans (Free, Pro, Team, Enterprise) ## Troubleshooting ### "Permission denied (publickey)" - Check that `DEPLOY_KEY` secret contains the private key (not the .pub file) - Verify the deploy key is added to the repository with write access ### Still getting "Protected branch update failed" - Ensure "Allow write access" is checked on the deploy key - Uncheck "Do not allow bypassing the above settings" in branch protection ### "Host key verification failed" - This shouldn't happen with GitHub, but if it does, the workflow will handle it automatically ## Security Notes ✅ Deploy key only has access to this one repository ✅ Can be revoked anytime from repository settings ✅ More secure than personal access tokens ✅ Doesn't expire ## Alternative: Personal Access Token If deploy keys don't work, you can use a PAT: 1. Create token at: https://github.com/settings/tokens/new 2. Select scope: `repo` 3. Add as secret: `PERSONAL_ACCESS_TOKEN` 4. Update workflow to use `token: ${{ secrets.PERSONAL_ACCESS_TOKEN }}` But deploy keys are preferred for single-repository automation. ================================================ FILE: docs/example_council.md ================================================ # Example Council Implementation This document shows how to implement a council class using the new utilities. 
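## Utility Sketch: `retry`

The example below relies on helper utilities (`retry`, `cached`, `http_get`, `get_logger`) from `uk_bin_collection.uk_bin_collection.utils`. Their implementations are not reproduced in this document, but a retry decorator with the signature used below typically looks something like this (a rough sketch, not the project's actual code):

```python
import functools
import time

# Rough sketch of a retry decorator matching the signature used in the
# example below; the real utility in utils/retry.py may differ.
def retry(tries=3, delay=1, backoff=2):
    """Retry the wrapped function up to `tries` times with exponential backoff."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            wait = delay
            for attempt in range(1, tries + 1):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempt == tries:
                        raise  # out of attempts: surface the original error
                    time.sleep(wait)
                    wait *= backoff  # grow the wait between attempts
        return wrapper
    return decorator
```

With `tries=3, delay=1, backoff=2`, a flaky request is attempted up to three times, waiting 1s and then 2s between attempts, which is why the decorators in the example below are stacked on `get_data`.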
## Basic Structure ```python from uk_bin_collection.uk_bin_collection.utils.retry import retry from uk_bin_collection.uk_bin_collection.utils.cache import cached from uk_bin_collection.uk_bin_collection.utils.http_client import get as http_get from uk_bin_collection.uk_bin_collection.utils.logger import get_logger # Get a logger for this module logger = get_logger(__name__) class ExampleCouncil: """Example council implementation using the new utilities.""" # Required class variables postcode_required = True paon_required = True def __init__(self, url): self.url = url self.postcode = None self.paon = None @cached(ttl=3600) # Cache for 1 hour @retry(tries=3, delay=1, backoff=2) def get_data(self): """Get bin collection data for this council.""" logger.info(f"Fetching data for postcode {self.postcode}") # Construct the URL with parameters params = { "postcode": self.postcode, "house_number": self.paon } # Make the request with automatic retry response = http_get(self.url, params=params) # Process the response # ... # Return the data in the standard format return [ { "type": "General Waste", "date": "01/01/2023" }, { "type": "Recycling", "date": "08/01/2023" } ] ``` ## Complete Example For a complete example, see the updated council class template: `/workspaces/UKBinCollectionData/uk_bin_collection/uk_bin_collection/councils/council_class_template/councilclasstemplate.py` ================================================ FILE: docs/github-app-setup.md ================================================ # GitHub App Setup Guide This guide walks you through creating and configuring a GitHub App to allow the release workflow to bypass branch protection rules. ## Why Use a GitHub App? 
✅ **More secure** - Fine-grained permissions, not tied to a user account ✅ **No expiration** - Tokens are automatically refreshed ✅ **Better audit trail** - Shows as the app, not a personal account ✅ **Team-friendly** - Won't break if someone leaves the team ✅ **Built-in bypass** - Can push to protected branches ## Step-by-Step Setup ### Step 1: Create the GitHub App 1. **Navigate to GitHub App settings:** - Personal account: https://github.com/settings/apps/new - Organization: https://github.com/organizations/YOUR_ORG/settings/apps/new 2. **Fill in the basic information:** - **GitHub App name**: `UKBinCollection Release Bot` (must be globally unique) - If taken, try: `UKBinCollection-Release-Bot-YourUsername` - **Homepage URL**: `https://github.com/robbrad/UKBinCollectionData` - **Description** (optional): `Automated release workflow for UK Bin Collection Data` 3. **Configure webhook:** - **Uncheck** "Active" under "Webhook" - We don't need webhooks for this use case 4. **Set repository permissions:** - **Contents**: `Read and write` ✅ (Required - to push commits and tags) - **Metadata**: `Read-only` (Automatically selected) - **Pull requests**: `Read and write` (Optional - for future features) 5. **Where can this GitHub App be installed?** - Select: **"Only on this account"** 6. **Click "Create GitHub App"** ### Step 2: Install the App 1. After creation, you'll see the app settings page 2. Click **"Install App"** in the left sidebar 3. Click **"Install"** next to your account/organization name 4. Choose installation scope: - Select **"Only select repositories"** - Check `UKBinCollectionData` 5. Click **"Install"** ### Step 3: Generate Private Key 1. Go back to the app settings page (Settings → Developer settings → GitHub Apps → Your App) 2. Scroll down to the **"Private keys"** section 3. Click **"Generate a private key"** 4. A `.pem` file will download automatically 5. 
**Save this file securely** - you'll need it in the next step ### Step 4: Get Your App Credentials You need two pieces of information: #### App ID - Found at the top of your app settings page - Example: `123456` - Copy this number #### Private Key - Open the downloaded `.pem` file in a text editor - Copy the **entire contents**, including: ``` -----BEGIN RSA PRIVATE KEY----- [long string of characters] -----END RSA PRIVATE KEY----- ``` ### Step 5: Add Secrets to Repository 1. Go to your repository: https://github.com/robbrad/UKBinCollectionData 2. Navigate to: **Settings → Secrets and variables → Actions** 3. Click **"New repository secret"** #### Add APP_ID Secret - **Name**: `APP_ID` - **Value**: Your App ID (e.g., `123456`) - Click "Add secret" #### Add APP_PRIVATE_KEY Secret - **Name**: `APP_PRIVATE_KEY` - **Value**: Paste the entire contents of the `.pem` file - **Important**: Include the `-----BEGIN RSA PRIVATE KEY-----` and `-----END RSA PRIVATE KEY-----` lines - Click "Add secret" ### Step 6: Verify Setup 1. The workflow is already configured to use these secrets 2. Test by merging a PR with a conventional commit message 3. Check the bump workflow logs to verify it runs successfully ## Troubleshooting ### "Bad credentials" error - **Cause**: Private key not copied correctly - **Solution**: Re-copy the entire `.pem` file contents, including BEGIN/END lines ### "Resource not accessible by integration" error - **Cause**: App doesn't have correct permissions - **Solution**: 1. Go to app settings 2. Check "Contents" permission is set to "Read and write" 3. Reinstall the app if needed ### "App not installed" error - **Cause**: App not installed on the repository - **Solution**: Go to https://github.com/settings/installations and install it ### Workflow still fails with permission error - **Cause**: Secrets not set correctly - **Solution**: 1. Verify `APP_ID` is just the number (no quotes) 2. Verify `APP_PRIVATE_KEY` includes the full key with headers 3. 
Check for extra spaces or line breaks ## Security Best Practices ✅ **Never commit** the `.pem` file to your repository ✅ **Store the `.pem` file** securely (password manager or secure vault) ✅ **Rotate keys** if compromised (generate new private key) ✅ **Review app permissions** periodically ✅ **Monitor app activity** in audit logs ## Managing the App ### View App Activity - Go to: Settings → Developer settings → GitHub Apps → Your App - Click "Advanced" tab to see delivery logs ### Regenerate Private Key 1. Go to app settings 2. Scroll to "Private keys" 3. Click "Generate a private key" 4. Update the `APP_PRIVATE_KEY` secret in your repository ### Uninstall/Reinstall - Go to: https://github.com/settings/installations - Click "Configure" next to your app - Adjust repository access or uninstall ## Alternative: Using a Personal Access Token If you prefer not to use a GitHub App, you can use a Personal Access Token instead: 1. Create a token at: https://github.com/settings/tokens/new 2. Select scope: `repo` (Full control of private repositories) 3. Add as secret: `BUMP_TOKEN` 4. Update workflow to use `${{ secrets.BUMP_TOKEN }}` instead of the app token However, GitHub Apps are recommended for production use. ## Next Steps Once setup is complete: 1. ✅ Test the workflow with a small PR 2. ✅ Monitor the first few releases 3. ✅ Document the app for your team 4. ✅ Set up monitoring/alerts for workflow failures ## Support If you encounter issues: - Check the [troubleshooting section](#troubleshooting) above - Review workflow logs in GitHub Actions - See the [main workflow documentation](./release-workflow.md) ================================================ FILE: docs/github-app-troubleshooting.md ================================================ # GitHub App Troubleshooting - Branch Protection ## Problem: "Protected branch update failed" If you're getting this error, the GitHub App isn't properly configured to bypass branch protection. 
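## What the Two Secrets Are Actually For

Before working through the steps, it helps to know what the workflow does with the secrets. `actions/create-github-app-token` uses `APP_PRIVATE_KEY` to sign a short-lived JWT identifying the app, then exchanges it for an installation token. A sketch of the JWT claims is below (the RS256 signing step needs a library such as PyJWT and is handled by the action, so it is omitted here):

```python
import time

# Sketch of the JWT claims a GitHub App presents, per GitHub's App
# authentication docs; the signing and token exchange are handled by
# actions/create-github-app-token in the workflow.
def build_app_jwt_claims(app_id, now=None):
    """Claims for the app JWT: iat (backdated), exp (<= 10 min), iss (App ID)."""
    now = int(time.time()) if now is None else now
    return {
        "iat": now - 60,      # backdate 60s to tolerate clock drift
        "exp": now + 9 * 60,  # GitHub rejects JWTs valid for more than 10 minutes
        "iss": app_id,        # the numeric App ID stored in the APP_ID secret
    }
```

A "Bad credentials" failure at this stage usually means the key in `APP_PRIVATE_KEY` cannot produce a valid signature for these claims (truncated key, wrong `APP_ID`, or stray whitespace), which is what the steps below help you rule out.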
## Solution Steps ### Step 1: Verify App Installation 1. Go to: https://github.com/settings/installations 2. Find your app (e.g., "UKBinCollection Release Bot") 3. Click "Configure" 4. Verify it's installed on `UKBinCollectionData` repository 5. Check that it has "Contents: Read and write" permission ### Step 2: Configure Branch Protection Go to: https://github.com/robbrad/UKBinCollectionData/settings/branches Click "Edit" on your `master` branch protection rule. #### Option A: Allow App to Bypass (Recommended) Scroll to **"Allow specified actors to bypass required pull requests"**: - Click "Add" - Search for your app name - Select it - Click "Save changes" #### Option B: If App Doesn't Appear in Search The app might not show up in the bypass list. Instead: 1. Temporarily disable "Require a pull request before merging" 2. Test the workflow 3. Re-enable after confirming it works OR 1. Under "Restrict who can push to matching branches": - Enable it - Add your GitHub App - This allows the app to push directly ### Step 3: Verify App Permissions Go to your app settings: https://github.com/settings/apps 1. Click on your app 2. Verify permissions: - **Contents**: Read and write ✅ - **Metadata**: Read-only ✅ 3. If permissions are wrong, update them 4. Go to https://github.com/settings/installations 5. Click "Configure" on your app 6. Click "Update" to refresh permissions ### Step 4: Check Secrets Verify secrets are set correctly: 1. Go to: https://github.com/robbrad/UKBinCollectionData/settings/secrets/actions 2. Verify `APP_ID` exists 3. Verify `APP_PRIVATE_KEY` exists 4. The private key should include: ``` -----BEGIN RSA PRIVATE KEY----- [key content] -----END RSA PRIVATE KEY----- ``` ### Step 5: Test the Workflow Create a test PR and merge it to see if it works. ## Alternative: Temporary Workaround If you need to release immediately while fixing the app setup: ### Manual Release ```bash # 1. Pull latest git checkout master git pull # 2. 
Bump version cz bump --yes --changelog # 3. Push (you'll need to temporarily disable branch protection) git push origin master --follow-tags ``` ### Or Use Personal Access Token Temporarily Update `.github/workflows/bump.yml`: ```yaml - name: Checkout uses: actions/checkout@v5 with: fetch-depth: 0 token: ${{ secrets.PERSONAL_ACCESS_TOKEN }} ``` Then: 1. Create a PAT at: https://github.com/settings/tokens/new 2. Select scope: `repo` 3. Add as secret: `PERSONAL_ACCESS_TOKEN` 4. This will work until you fix the app setup ## Common Issues ### "Bad credentials" - Private key not copied correctly - Missing BEGIN/END lines - Extra spaces or line breaks ### "Resource not accessible by integration" - App doesn't have correct permissions - App not installed on repository - Need to update app permissions ### "App not found" - APP_ID is incorrect - App was deleted or renamed ## Still Not Working? If none of the above works, you have two options: ### Option 1: Use Personal Access Token (Quick Fix) See "Alternative: Temporary Workaround" above ### Option 2: Remove Branch Protection for Automation 1. Create a separate branch protection rule 2. Exclude `github-actions[bot]` from restrictions 3. This allows the workflow to push ## Need More Help? Check the workflow logs for the exact error message and search for it in: - GitHub Actions documentation - GitHub App documentation - Stack Overflow ================================================ FILE: docs/manual-tag-fix.md ================================================ # Manual Tag Fix for Version 0.155.0 Since the tag wasn't pushed, you need to manually create and push it to trigger the release. ## Steps: 1. **Pull the latest changes:** ```bash git checkout master git pull ``` 2. **Create the annotated tag:** ```bash git tag -a 0.155.0 -m "Release 0.155.0" ``` 3. **Push the tag:** ```bash git push origin 0.155.0 ``` 4. 
**Verify:**
   - Check tags: https://github.com/robbrad/UKBinCollectionData/tags
   - Watch the release workflow: https://github.com/robbrad/UKBinCollectionData/actions

## What Was Fixed

1. **bump.yml** - Changed from `--follow-tags` to separate push commands
2. **pyproject.toml** - Added `annotated_tag = true` to Commitizen config

## Future Releases

The next merge will automatically:

1. Create annotated tag
2. Push commit and tag separately
3. Trigger release workflow
4. Publish to PyPI

No manual intervention needed!

================================================
FILE: docs/release-quick-reference.md
================================================

# Release Workflow Quick Reference

## Commit Message Cheat Sheet

| Type | Version Bump | Example |
|------|--------------|---------|
| `feat:` | Minor (0.152.0 → 0.153.0) | `feat(councils): add Leeds support` |
| `fix:` | Patch (0.152.0 → 0.152.1) | `fix(selenium): handle timeout` |
| `feat!:` or `BREAKING CHANGE:` | Major (0.152.0 → 1.0.0) | `feat!: change API format` |
| `docs:` | None | `docs: update README` |
| `style:` | None | `style: format code` |
| `refactor:` | None | `refactor: simplify parser` |
| `test:` | None | `test: add unit tests` |
| `chore:` | None | `chore: update dependencies` |

## Workflow Stages

```
PR → Tests → Merge → Bump (auto) → Tag (auto) → Release (auto) → PyPI (auto)
```

Everything after merge is fully automated!

## How It Works

1. **Developer**: Create PR with conventional commits
2. **CI**: Validates commits and runs tests
3. **Merge**: PR merged to master
4. **Commitizen**: Analyzes commits, bumps version, updates CHANGELOG
5. **Git**: Creates tag and pushes
6. **Release**: Publishes to PyPI and GitHub

## Common Commands

```bash
# Check current version
poetry version -s

# Validate before PR
make pre-build

# Run tests locally
make unit-tests
make integration-tests

# Check commit messages
git log --oneline

# Manual bump (if needed)
cz bump --yes --changelog
git push origin master --follow-tags
```

## Troubleshooting

| Problem | Solution |
|---------|----------|
| Version bump didn't happen | Check commit message format (must use `feat:` or `fix:`) |
| Release didn't trigger | Check if tag was created in bump workflow logs |
| PyPI publish failed | Verify `PYPI_API_KEY` secret is set |
| Permission error | Verify `APP_ID` and `APP_PRIVATE_KEY` secrets are set |
| Version files out of sync | Commitizen handles this automatically |

## Required Secrets

- `APP_ID` - GitHub App ID (for protected branches)
- `APP_PRIVATE_KEY` - GitHub App private key
- `PYPI_API_KEY` - For PyPI publishing
- `CODECOV_TOKEN` - For test coverage (optional)

**Note:** Uses GitHub App for secure, non-expiring authentication

## Workflow Files

- `behave_pull_request.yml` - PR tests
- `lint.yml` - Commit message validation
- `validate-release-ready.yml` - Pre-merge checks
- `bump.yml` - Automated version bumping and tagging
- `release.yml` - Publishing to PyPI and GitHub

## Version Files (Auto-Synced)

Commitizen automatically updates:

- `pyproject.toml`
- `custom_components/uk_bin_collection/manifest.json`
- `custom_components/uk_bin_collection/const.py`
- `CHANGELOG.md`

## Quick Links

- [Full Documentation](./release-workflow.md)
- [Setup Checklist](./release-workflow-setup-checklist.md)
- [Conventional Commits](https://www.conventionalcommits.org/)
- [Semantic Versioning](https://semver.org/)
- [Commitizen](https://commitizen-tools.github.io/commitizen/)

================================================
FILE: docs/release-workflow-branch-protection.md
================================================

================================================
FILE:
docs/release-workflow-diagram.md
================================================

# Release Workflow Diagram

```
┌──────────────────────────────────────────────────────────┐
│ PULL REQUEST STAGE                                       │
└──────────────────────────────────────────────────────────┘

Developer creates PR → master
    ↓
┌──────────────────────────────────────────────────────────┐
│ Automated Checks Run in Parallel:                        │
│                                                          │
│ ✓ behave_pull_request.yml                                │
│   - Unit tests (Python 3.12)                             │
│   - Integration tests (changed councils only)            │
│   - Parity check (councils/input.json/features)          │
│                                                          │
│ ✓ lint.yml                                               │
│   - Validate conventional commit messages                │
│                                                          │
│ ✓ validate-release-ready.yml                             │
│   - Check version file consistency                       │
│   - Validate pyproject.toml                              │
│   - Verify commitizen config                             │
│                                                          │
│ ✓ hacs_validation.yml                                    │
│   - Validate Home Assistant integration                  │
│                                                          │
│ ✓ codeql-analysis.yml                                    │
│   - Security scanning                                    │
└──────────────────────────────────────────────────────────┘
    ↓
All checks pass? → YES → Ready to merge
    ↓ NO
Fix issues and push updates

┌──────────────────────────────────────────────────────────┐
│ MERGE TO MASTER STAGE                                    │
└──────────────────────────────────────────────────────────┘

PR merged to master
    ↓
┌──────────────────────────────────────────────────────────┐
│ bump.yml workflow triggers                               │
│                                                          │
│ 1. Commitizen analyzes commits since last tag            │
│    - feat: → minor bump (0.152.0 → 0.153.0)              │
│    - fix: → patch bump (0.152.0 → 0.152.1)               │
│    - BREAKING CHANGE → major bump (0.152.0 → 1.0.0)      │
│                                                          │
│ 2. Updates version in:                                   │
│    - pyproject.toml                                      │
│    - manifest.json                                       │
│    - const.py                                            │
│                                                          │
│ 3. Creates commit: "bump: version X.Y.Z"                 │
│                                                          │
│ 4. Creates git tag: X.Y.Z                                │
│                                                          │
│ 5.
Pushes commit and tag to master                        │
└──────────────────────────────────────────────────────────┘
    ↓
Tag pushed → Triggers release workflow

┌──────────────────────────────────────────────────────────┐
│ RELEASE STAGE                                            │
└──────────────────────────────────────────────────────────┘

Tag X.Y.Z pushed
    ↓
┌──────────────────────────────────────────────────────────┐
│ release.yml workflow triggers                            │
│                                                          │
│ 1. Checkout tagged commit                                │
│                                                          │
│ 2. Install Poetry and dependencies                       │
│                                                          │
│ 3. Verify version matches tag                            │
│    (Poetry version == Git tag)                           │
│                                                          │
│ 4. Build Python package                                  │
│    poetry build → creates dist/*.whl and dist/*.tar.gz   │
│                                                          │
│ 5. Create GitHub Release                                 │
│    - Auto-generated release notes                        │
│    - Attach build artifacts                              │
│                                                          │
│ 6. Publish to PyPI                                       │
│    poetry publish → uploads to pypi.org                  │
└──────────────────────────────────────────────────────────┘
    ↓
┌──────────────────────────────────────────────────────────┐
│ Release Complete! ✓                                      │
│                                                          │
│ - GitHub Release: github.com/robbrad/.../releases        │
│ - PyPI Package: pypi.org/project/uk-bin-collection       │
│ - HACS Update: Available to Home Assistant users         │
└──────────────────────────────────────────────────────────┘

┌──────────────────────────────────────────────────────────┐
│ COMMIT MESSAGE EXAMPLES                                  │
└──────────────────────────────────────────────────────────┘

feat(councils): add Birmingham City Council support
→ Minor version bump (0.152.0 → 0.153.0)

fix(selenium): handle connection timeout errors
→ Patch version bump (0.152.0 → 0.152.1)

feat(api)!: change date format to ISO 8601
BREAKING CHANGE: API responses now use ISO 8601 dates
→ Major version bump (0.152.0 → 1.0.0)

docs: update README with new council instructions
→ No version bump (documentation only)

┌──────────────────────────────────────────────────────────┐
│ TROUBLESHOOTING FLOW                                     │
└──────────────────────────────────────────────────────────┘

Issue:
Version bump didn't happen
    ↓
Check: Are commits using conventional format?
    ↓ NO → Fix commit messages and force push
    ↓ YES
Check: Is PERSONAL_ACCESS_TOKEN set?
    ↓ NO → Add secret in repo settings
    ↓ YES
Check: Bump workflow logs for errors
    ↓
Manual fix: Run commitizen locally

─────────────────────────────────────────────

Issue: Release didn't publish
    ↓
Check: Was tag created by bump workflow?
    ↓ NO → Check bump workflow logs
    ↓ YES
Check: Is PYPI_API_KEY valid?
    ↓ NO → Update secret in repo settings
    ↓ YES
Check: Release workflow logs for errors
    ↓
Manual fix: Run poetry publish locally

┌──────────────────────────────────────────────────────────┐
│ WORKFLOW DEPENDENCIES                                    │
└──────────────────────────────────────────────────────────┘

Secrets Required:
├── PERSONAL_ACCESS_TOKEN (for bump workflow)
│   └── Needs: repo write access
│
├── PYPI_API_KEY (for release workflow)
│   └── Needs: PyPI project upload permissions
│
└── CODECOV_TOKEN (for test coverage)
    └── Needs: Codecov project access

Environments:
├── bump (requires approval)
└── release (requires approval)

Configuration Files:
├── pyproject.toml
│   ├── [tool.poetry] version
│   └── [tool.commitizen] config
│
├── custom_components/uk_bin_collection/manifest.json
│   └── version field
│
└── custom_components/uk_bin_collection/const.py
    └── INPUT_JSON_URL (includes version)
```

================================================
FILE: docs/release-workflow-fixes.md
================================================

# Release Workflow Fixes Applied

## Summary

Fixed the release workflow to ensure proper version bumping and release publishing from PR merge to master.

## Issues Identified

1. **Bump workflow wasn't explicitly pushing tags**
   - Commitizen action needed the `push: true` parameter
   - No confirmation output of the version bump

2. **Release workflow lacked validation**
   - No verification that the Poetry version matches the git tag
   - Could publish mismatched versions

3.
**Missing pre-merge validation**
   - No check that version files are in sync before merge
   - Could lead to failed releases

4. **Inconsistent Poetry installation**
   - Some workflows used `abatilo/actions-poetry`
   - Others used `pipx install poetry`

## Changes Made

### 1. Updated `.github/workflows/bump.yml`

**Changes:**
- Added explicit Poetry installation for consistency
- Added `push: true` to the commitizen action to ensure tags are pushed
- Added version output for debugging

**Why:** Ensures tags are created and pushed, triggering the release workflow

### 2. Updated `.github/workflows/release.yml`

**Changes:**
- Added `fetch-depth: 0` to checkout for full git history
- Renamed the "Run image" step to "Install Poetry" for clarity
- Added a version verification step to ensure the Poetry version matches the tag
- Split build and publish into separate steps
- Added build artifacts to the GitHub release

**Why:** Prevents publishing mismatched versions and improves reliability

### 3. Created `.github/workflows/validate-release-ready.yml`

**New workflow that:**
- Validates `pyproject.toml` syntax
- Checks version consistency across files
- Validates the commitizen configuration
- Runs on every PR

**Why:** Catches version sync issues before merge, preventing failed releases

### 4. Created `docs/release-workflow.md`

**Comprehensive documentation covering:**
- Complete workflow stages (PR → Merge → Release)
- Commit message format and examples
- Version numbering strategy
- Troubleshooting guide
- Manual release procedure
- Required secrets and environments

**Why:** Provides clear guidance for contributors and maintainers

### 5. Created `docs/release-workflow-diagram.md`

**Visual documentation showing:**
- ASCII flow diagrams for each stage
- Parallel workflow execution
- Decision points and error handling
- Commit message examples with version impacts
- Troubleshooting flows
- Dependency tree

**Why:** Makes the workflow easy to understand at a glance

## Complete Workflow Flow

```
1. Developer creates PR
   ↓
2. Automated tests run (behave, lint, validate)
   ↓
3. PR approved and merged to master
   ↓
4. bump.yml triggers:
   - Analyzes commits
   - Bumps version in all files
   - Creates commit "bump: version X.Y.Z"
   - Creates and pushes tag X.Y.Z
   ↓
5. release.yml triggers (on tag push):
   - Verifies version matches tag
   - Builds package
   - Creates GitHub release
   - Publishes to PyPI
   ↓
6. Release complete!
```

## Testing the Workflow

### Test 1: Version Validation

```bash
# Should pass if versions are in sync
poetry check
jq -r '.version' custom_components/uk_bin_collection/manifest.json
poetry version -s
```

### Test 2: Commit Message Format

```bash
# Valid examples:
git commit -m "feat(councils): add new council"
git commit -m "fix(selenium): handle timeout"
git commit -m "docs: update README"

# Invalid examples (will fail lint):
git commit -m "added new feature"
git commit -m "Fixed bug"
```

### Test 3: Manual Version Bump (if needed)

```bash
# Bump version
poetry version patch  # or minor/major

# Update manifest
# Edit custom_components/uk_bin_collection/manifest.json

# Commit and tag
git add .
git commit -m "bump: version X.Y.Z"
git tag X.Y.Z
git push origin master --tags
```

## Required Secrets

Ensure these are set in the GitHub repository settings:

1. **PERSONAL_ACCESS_TOKEN**
   - Settings → Secrets → Actions
   - Needs: `repo` scope
   - Used by: bump.yml

2. **PYPI_API_KEY**
   - Settings → Secrets → Actions
   - Get from: pypi.org account settings
   - Used by: release.yml

3. **CODECOV_TOKEN**
   - Settings → Secrets → Actions
   - Get from: codecov.io project settings
   - Used by: test workflows

## Verification Checklist

After merging these changes:

- [ ] Verify the `PERSONAL_ACCESS_TOKEN` secret is set
- [ ] Verify the `PYPI_API_KEY` secret is set
- [ ] Verify the `CODECOV_TOKEN` secret is set
- [ ] Check that the bump and release environments exist
- [ ] Test with a small PR using conventional commits
- [ ] Monitor that the bump workflow creates a tag
- [ ] Monitor that the release workflow publishes to PyPI
- [ ] Verify the GitHub release is created
- [ ] Check the PyPI package is available

## Rollback Plan

If issues occur:

1. **Disable automatic bumping:**
   - Add `[skip ci]` to commit messages
   - Or temporarily disable the bump.yml workflow

2. **Manual release:**
   - Follow the manual release procedure in docs/release-workflow.md
   - Use `poetry version` and `poetry publish` directly

3. **Revert workflow changes:**
   - Git revert the workflow file changes
   - Return to the previous manual process

## Next Steps

1. Merge these workflow fixes to master
2. Test with a small feature PR
3. Monitor the complete flow
4. Update team documentation if needed
5. Consider adding release notifications (Slack, Discord, etc.)

## Additional Improvements (Future)

Consider adding:

- Slack/Discord notifications on release
- Automated changelog generation
- A release candidate (RC) workflow for testing
- Automated rollback on failed releases
- Release metrics and monitoring
- A pre-release testing environment

================================================
FILE: docs/release-workflow-migration.md
================================================

# Release Workflow Migration Guide

## Overview

The release workflow has been simplified to use Commitizen and GITHUB_TOKEN, eliminating complexity and manual steps.
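To make the bump rules concrete, here is a simplified sketch of how a conventional-commit-driven version bump is chosen. This is illustrative only, not Commitizen's actual implementation; `bump_kind` is a hypothetical helper that scans commit subjects and picks the largest bump implied (major > minor > patch > none):

```shell
#!/bin/sh
# Sketch of conventional-commit bump selection (illustrative, not cz itself).
# Reads commit subjects on stdin and prints: major, minor, patch, or none.
bump_kind() {
    kind=none
    while IFS= read -r subject; do
        case "$subject" in
            # Breaking changes always win
            *"BREAKING CHANGE"*|feat!*|fix!*) kind=major ;;
            # feat: upgrades anything below major to minor
            feat*) [ "$kind" = major ] || kind=minor ;;
            # fix: only upgrades none to patch
            fix*)  case "$kind" in major|minor) ;; *) kind=patch ;; esac ;;
        esac
    done
    echo "$kind"
}

# A feat commit outweighs a fix commit, so this run reports "minor":
printf 'fix(selenium): handle timeout\nfeat(councils): add Leeds support\n' | bump_kind
```

In the real workflow, `cz bump` derives the same decision from `git log` since the last tag and then rewrites every file listed in `version_files`.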
## What Changed

### Before (Complex)

- Bump workflow created PRs for version bumps
- Required a PERSONAL_ACCESS_TOKEN secret
- Manual PR review and merge for version bumps
- Separate environments with approvals
- Manual version file syncing
- Multiple validation steps

### After (Simplified)

- Bump workflow directly commits and tags on master
- Uses the built-in GITHUB_TOKEN
- Fully automated after PR merge
- No environment approvals needed
- Commitizen auto-syncs all version files
- Streamlined validation

## Key Improvements

### 1. Removed the PERSONAL_ACCESS_TOKEN Requirement

- **Before**: Required creating and managing a personal access token
- **After**: Uses the built-in `GITHUB_TOKEN` with proper permissions
- **Benefit**: One less secret to manage and rotate

### 2. Eliminated Bump PRs

- **Before**: The bump workflow created a PR that needed a manual merge
- **After**: The bump workflow directly commits to master after PR merge
- **Benefit**: Faster releases, no manual intervention

### 3. Automatic Version Syncing

- **Before**: Manual checks to ensure version files stayed in sync
- **After**: Commitizen automatically updates all configured files
- **Benefit**: No version mismatch errors

### 4. Simplified Configuration

- **Before**: Complex environment setup with approvals
- **After**: Simple workflow permissions
- **Benefit**: Easier to set up and maintain

### 5. Better CHANGELOG Management

- **Before**: Manual or semi-automated changelog updates
- **After**: Commitizen automatically generates the changelog from commits
- **Benefit**: A consistent, automated changelog

## Migration Steps

### 1. Set Up GitHub App

Follow the [GitHub App Setup Guide](./github-app-setup.md) to:

- Create a GitHub App
- Install it on your repository
- Generate a private key
- Add the `APP_ID` and `APP_PRIVATE_KEY` secrets

### 2. Update Secrets

```bash
# Add (new requirements)
+ APP_ID
+ APP_PRIVATE_KEY

# Keep (still required)
- PYPI_API_KEY
- CODECOV_TOKEN (optional)

# Remove (no longer needed)
- PERSONAL_ACCESS_TOKEN (if you had one)
```

### 3. Update Workflow Permissions

Ensure Settings → Actions → General → Workflow permissions:

- Select "Read and write permissions"
- Enable "Allow GitHub Actions to create and approve pull requests"

### 4. Remove Environments (Optional)

The `bump` and `release` environments are no longer required, but can be kept if you want manual approval gates.

### 5. Update pyproject.toml

Ensure the Commitizen configuration includes:

```toml
[tool.commitizen]
name = "cz_conventional_commits"
version_provider = "poetry"
version_scheme = "semver"
major_version_zero = true
tag_format = "$version"
update_changelog_on_bump = true
version_files = [
    "custom_components/uk_bin_collection/manifest.json:version",
    "custom_components/uk_bin_collection/manifest.json:requirements",
    "custom_components/uk_bin_collection/const.py:INPUT_JSON_URL"
]
```

### 6. Test the New Workflow

1. Create a test branch with a conventional commit
2. Open and merge a PR
3. Watch the automated bump and release workflows
4. Verify the release appears on PyPI and GitHub

## Workflow Comparison

### Old Workflow

```
PR → Tests → Merge → Bump Workflow → Bump PR → Manual Review → Merge Bump PR → Tag → Release
```

### New Workflow

```
PR → Tests → Merge → Bump (auto) → Tag (auto) → Release (auto)
```

## File Changes

### Modified Files

- `.github/workflows/bump.yml` - Simplified to direct commit/tag
- `.github/workflows/release.yml` - Cleaned up and uses GITHUB_TOKEN
- `.github/workflows/validate-release-ready.yml` - Removed version sync checks
- `pyproject.toml` - Enhanced Commitizen configuration
- `docs/release-workflow.md` - Updated documentation
- `docs/release-workflow-setup-checklist.md` - Simplified checklist
- `docs/release-quick-reference.md` - Updated quick reference

### New Files

- `docs/release-workflow-migration.md` - This file

## Rollback Plan

If you need to roll back to the old workflow:

1. Revert the workflow files:

   ```bash
   git revert <commit-sha>
   ```

2. Re-add the PERSONAL_ACCESS_TOKEN secret
3. Recreate the bump and release environments

## Benefits Summary

✅ Fewer secrets to manage
✅ Faster release process
✅ No manual intervention needed
✅ Automatic version syncing
✅ Better changelog generation
✅ Simpler configuration
✅ Easier to understand and maintain

## Support

If you encounter issues:

1. Check the workflow logs in GitHub Actions
2. Review `docs/release-workflow.md`
3. Verify GITHUB_TOKEN permissions
4. Ensure conventional commits are used
5. Check the Commitizen configuration

## Next Steps

1. Review the updated documentation
2. Test the new workflow with a small change
3. Monitor the first few automated releases
4. Update team documentation and training
5. Remove the old PERSONAL_ACCESS_TOKEN secret

## Questions?
See the full documentation:

- [Release Workflow](./release-workflow.md)
- [Setup Checklist](./release-workflow-setup-checklist.md)
- [Quick Reference](./release-quick-reference.md)

================================================
FILE: docs/release-workflow-setup-checklist.md
================================================

# Release Workflow Setup Checklist

Use this checklist to verify your simplified release workflow is properly configured.

## GitHub Repository Settings

### GitHub App Setup (for protected branches)

- [ ] Create a GitHub App
  - Go to: https://github.com/settings/apps/new
  - Name: `UKBinCollection Release Bot` (must be unique)
  - Homepage URL: `https://github.com/robbrad/UKBinCollectionData`
  - Uncheck "Active" under Webhook
  - Repository permissions:
    - **Contents**: Read and write
    - **Metadata**: Read-only (auto-selected)
  - Where can this be installed: "Only on this account"
  - Click "Create GitHub App"

- [ ] Install the app on your repository
  - Click "Install App" in the left sidebar
  - Click "Install" next to your account
  - Select "Only select repositories"
  - Choose `UKBinCollectionData`
  - Click "Install"

- [ ] Generate and save credentials
  - In the app settings, scroll to "Private keys"
  - Click "Generate a private key"
  - Save the downloaded `.pem` file securely
  - Note your **App ID** (shown at the top of the settings page)

### Secrets Configuration

- [ ] `APP_ID` is set
  - Path: Settings → Secrets and variables → Actions → Repository secrets
  - Value: Your GitHub App ID (e.g., `123456`)

- [ ] `APP_PRIVATE_KEY` is set
  - Path: Settings → Secrets and variables → Actions → Repository secrets
  - Value: Entire contents of the `.pem` file
  - Include the `-----BEGIN RSA PRIVATE KEY-----` and `-----END RSA PRIVATE KEY-----` lines

- [ ] `PYPI_API_KEY` is set
  - Path: Settings → Secrets and variables → Actions → Repository secrets
  - Get from: https://pypi.org/manage/account/token/
  - Scope: Project-specific or account-wide
  - Test: Should allow publishing packages

- [ ] `CODECOV_TOKEN` is set (optional)
  - Path: Settings → Secrets and variables → Actions → Repository secrets
  - Get from: https://codecov.io/gh/robbrad/UKBinCollectionData/settings
  - Test: Should allow uploading coverage reports

### Branch Protection Rules

- [ ] `master` branch is protected
  - Path: Settings → Branches → Add rule
  - Branch name pattern: `master`
  - Recommended settings:
    - [x] Require a pull request before merging
    - [x] Require status checks to pass before merging
    - [x] Require branches to be up to date before merging

### Actions Permissions

- [ ] Workflows have write permissions
  - Path: Settings → Actions → General → Workflow permissions
  - Select: "Read and write permissions"
  - [x] Allow GitHub Actions to create and approve pull requests

## Local Configuration

### pyproject.toml

- [ ] Version is set correctly

  ```toml
  [tool.poetry]
  version = "X.Y.Z"
  ```

- [ ] Commitizen is configured

  ```toml
  [tool.commitizen]
  name = "cz_conventional_commits"
  version_provider = "poetry"
  version_scheme = "semver"
  major_version_zero = true
  tag_format = "$version"
  update_changelog_on_bump = true
  version_files = [
      "custom_components/uk_bin_collection/manifest.json:version",
      "custom_components/uk_bin_collection/manifest.json:requirements",
      "custom_components/uk_bin_collection/const.py:INPUT_JSON_URL"
  ]
  ```

## Workflow Files

### Required Workflows

- [ ] `.github/workflows/behave_pull_request.yml` exists
- [ ] `.github/workflows/lint.yml` exists
- [ ] `.github/workflows/validate-release-ready.yml` exists
- [ ] `.github/workflows/bump.yml` exists (simplified)
- [ ] `.github/workflows/release.yml` exists
- [ ] `.github/workflows/hacs_validation.yml` exists

### Workflow Configuration

- [ ] `bump.yml` uses `GITHUB_TOKEN`

  ```yaml
  token: ${{ secrets.GITHUB_TOKEN }}
  ```

- [ ] `bump.yml` runs `cz bump --yes --changelog`

  ```yaml
  - name: Bump version and create tag
    run: cz bump --yes --changelog
  ```

- [ ] `release.yml` uses `PYPI_API_KEY`

  ```yaml
  poetry config pypi-token.pypi "${{ secrets.PYPI_API_KEY }}"
  ```

## Testing

### Pre-Merge Testing

- [ ] Create a test branch

  ```bash
  git checkout -b test/release-workflow
  ```

- [ ] Make a small change with a conventional commit

  ```bash
  echo "# Test" >> README.md
  git add README.md
  git commit -m "fix: test release workflow"
  ```

- [ ] Push and create a PR

  ```bash
  git push origin test/release-workflow
  # Create PR on GitHub
  ```

- [ ] Verify workflows run:
  - [ ] `behave_pull_request.yml` runs
  - [ ] `lint.yml` runs
  - [ ] `validate-release-ready.yml` runs
  - [ ] All checks pass

### Post-Merge Testing

- [ ] Merge the test PR
- [ ] Verify `bump.yml` runs automatically
- [ ] Check workflow logs:
  - [ ] Commitizen analyzed commits
  - [ ] Version was bumped in all files
  - [ ] CHANGELOG.md was updated
  - [ ] Commit was created with message `bump: version X.Y.Z`
  - [ ] Tag was created and pushed
  - [ ] No errors in logs
- [ ] Verify `release.yml` runs automatically
- [ ] Check workflow logs:
  - [ ] Version verification passed
  - [ ] Package was built
  - [ ] GitHub release was created
  - [ ] PyPI publish succeeded

### Verification

- [ ] Check the GitHub releases page
  - URL: https://github.com/robbrad/UKBinCollectionData/releases
  - Latest release should be visible
  - Release notes should be auto-generated
  - Build artifacts should be attached

- [ ] Check the PyPI package page
  - URL: https://pypi.org/project/uk-bin-collection/
  - Latest version should be available
  - Package should be installable:

    ```bash
    pip install uk-bin-collection==X.Y.Z
    ```

- [ ] Check version files are synced

  ```bash
  # All should show the same version
  poetry version -s
  jq -r '.version' custom_components/uk_bin_collection/manifest.json
  grep INPUT_JSON_URL custom_components/uk_bin_collection/const.py
  ```

## Rollback Plan

If something goes wrong:

### Delete a Bad Release

```bash
# Delete tag locally
git tag -d X.Y.Z

# Delete tag remotely
git push origin :refs/tags/X.Y.Z

# Delete the GitHub release manually on GitHub
```

### Manual Release

```bash
# Bump version with Commitizen
cz bump --yes --changelog

# Push changes and tags
git push origin master --follow-tags

# Or manually build and publish
poetry build
poetry publish
```

## Maintenance

### Regular Checks

- [ ] Quarterly: Verify PYPI_API_KEY hasn't expired
- [ ] Quarterly: Update workflow actions to the latest versions
- [ ] Quarterly: Review and update documentation

### Action Updates

Check for updates to GitHub Actions:

- `actions/checkout@v5` → Check for a newer version
- `actions/setup-python@v6` → Check for a newer version
- `abatilo/actions-poetry@v4.0.0` → Check for a newer version
- `ncipollo/release-action@v1` → Check for a newer version

## Support

If you encounter issues:

1. Check workflow logs in the GitHub Actions tab
2. Review the documentation in `docs/release-workflow.md`
3. Verify GITHUB_TOKEN has write permissions
4. Ensure conventional commits are used
5. Check the Commitizen configuration in pyproject.toml

## Sign-Off

- [ ] All checklist items completed
- [ ] Test release successful
- [ ] Documentation reviewed

**Completed by:** _______________
**Date:** _______________

================================================
FILE: docs/release-workflow.md
================================================

# Release Workflow Documentation

## Overview

This document describes the complete release workflow from pull request to published release.

## Workflow Stages

### 1.
Pull Request Stage

**Triggers:** When a PR is opened/updated targeting the `master` branch

**Workflows that run:**
- `behave_pull_request.yml` - Runs tests on changed councils
- `lint.yml` - Validates that commit messages follow conventional commits
- `validate-release-ready.yml` - Validates pyproject.toml and commit messages
- `hacs_validation.yml` - Validates the Home Assistant integration

**What happens:**
- Unit tests run on Python 3.12
- Integration tests run only for changed council files
- A parity check ensures councils, input.json, and feature files are in sync
- Commit messages are validated against the conventional commits format
- pyproject.toml is validated

**Requirements to merge:**
- All tests must pass
- Commit messages must follow the conventional commits format
- Code must pass linting

### 2. Merge to Master Stage

**Triggers:** When a PR is merged to the `master` branch

**Workflow that runs:**
- `bump.yml` - Automatically bumps the version and creates a release

**What happens:**
1. Commitizen analyzes commit messages since the last tag
2. Determines the version bump type (major/minor/patch) based on conventional commits:
   - `feat:` → minor version bump
   - `fix:` → patch version bump
   - `BREAKING CHANGE:` → major version bump
3. Updates the version in all configured files:
   - `pyproject.toml`
   - `custom_components/uk_bin_collection/manifest.json`
   - `custom_components/uk_bin_collection/const.py`
4. Updates CHANGELOG.md
5. Creates a commit with message `bump: version X.Y.Z`
6. Creates and pushes a git tag `X.Y.Z`
7. Pushes the commit and tag to master

**Note:** The bump workflow is skipped if the commit message starts with `bump:` to prevent infinite loops.

### 3. Release Stage

**Triggers:** When a tag is pushed (automatically by the bump workflow)

**Workflow that runs:**
- `release.yml` - Publishes the release

**What happens:**
1. Checks out the tagged commit
2. Verifies the Poetry version matches the git tag
3. Builds the Python package with Poetry
4. Creates a GitHub release with auto-generated release notes
5. Publishes the package to PyPI
6. Attaches build artifacts to the GitHub release

## Commit Message Format

Follow the [Conventional Commits](https://www.conventionalcommits.org/) format:

```
<type>(<scope>):