Kampi

Everything posted by Kampi

  1. Kampi

    XADC - AD7 sampled on AD14

    It seems that the inputs are disturbing each other. When I apply the input voltage to only one input, all four LEDs change their brightness as I increase and decrease the voltage. When I connect an input voltage to AD14 and AD7, I can change the brightness of LED M15 with AD7 and the brightness of the other three LEDs with AD14. I can manipulate the brightness of LED D18 when I connect the input voltage to AD6, and the brightness of all other LEDs when I connect it to AD7.
  2. Kampi

    XADC - AD7 sampled on AD14

    Hello @jpeyron, I've copied the settings for the XADC and the software from your last project (June 24 2017) in this thread and modified the code to read Vaux15, Vaux14 and Vaux7:

        ExtVolRawData = XSysMon_GetAdcData(SysMonInstPtr, XSM_CH_AUX_MAX);
        ExtVolData = XSysMon_RawToExtVoltage(ExtVolRawData);
        printf("The Current Vaux15 is %0d.%03d Volts. \r\n", (int)(ExtVolData), SysMonFractionToInt(ExtVolData));

        ExtVolRawData = XSysMon_GetAdcData(SysMonInstPtr, XSM_CH_AUX_MIN + 7);
        ExtVolData = XSysMon_RawToExtVoltage(ExtVolRawData);
        printf("The Current Vaux7 is %0d.%03d Volts. \r\n", (int)(ExtVolData), SysMonFractionToInt(ExtVolData));

        ExtVolRawData = XSysMon_GetAdcData(SysMonInstPtr, XSM_CH_AUX_MAX - 1);
        ExtVolData = XSysMon_RawToExtVoltage(ExtVolRawData);
        printf("The Current Vaux14 is %0d.%03d Volts. \r\n", (int)(ExtVolData), SysMonFractionToInt(ExtVolData));

        usleep(500000);

    But the problem still exists, and it exists in the HDL XADC project too. The brightness of LEDs M14 and M15 changes when I modify the voltage on pin AD7 of the Zybo. By the way, the brightness of LEDs D18 and G14 changes with the voltage on pin AD15.
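    To keep the reads in one place while I test, I also wrapped them in a small helper (a sketch only; SysMonInstPtr and SysMonFractionToInt() come from the project linked above, and the helper lives at file scope):

        /* Sketch: read one aux channel through the XSysMon driver and print it
         * in the same format as above. */
        static void PrintAuxChannel(XSysMon *SysMonInstPtr, u8 Channel, const char *Name)
        {
            u16 RawData = XSysMon_GetAdcData(SysMonInstPtr, Channel);
            float Voltage = XSysMon_RawToExtVoltage(RawData);
            printf("The Current %s is %0d.%03d Volts. \r\n", Name, (int)(Voltage), SysMonFractionToInt(Voltage));
        }

        /* Inside the main loop: */
        PrintAuxChannel(SysMonInstPtr, XSM_CH_AUX_MAX,     "Vaux15");
        PrintAuxChannel(SysMonInstPtr, XSM_CH_AUX_MIN + 7, "Vaux7");
        PrintAuxChannel(SysMonInstPtr, XSM_CH_AUX_MAX - 1, "Vaux14");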
  3. Kampi

    XADC - AD7 sampled on AD14

    A small update: it seems that the channel sequencer has something to do with this behavior. The problem exists when I use XADCPS_SEQ_MODE_CONTINPASS or XADCPS_SEQ_MODE_ONEPASS, but disappears when I use XADCPS_SEQ_MODE_SIMUL_SAMPLING.
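    The only difference between the two runs is this one call (using the XAdcPs instance from my original post further down in this list):

        /* Crosstalk between the aux channels is visible with these two modes: */
        XAdcPs_SetSequencerMode(&XAdc, XADCPS_SEQ_MODE_CONTINPASS);   /* or XADCPS_SEQ_MODE_ONEPASS */

        /* The channels behave as expected with this mode: */
        XAdcPs_SetSequencerMode(&XAdc, XADCPS_SEQ_MODE_SIMUL_SAMPLING);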
  4. Kampi

    XADC - AD7 sampled on AD14

    Hello, I have a Zybo board (Version 1 Rev.) and I have a strange issue with my XADC: the input I apply to channel AD7 is sampled on channel AD14. Please take a look at my setup: I want to use the differential channel pairs 7 and 15 (as in the photo, upper row VIn and lower row ground from my voltage source). My software gives me results for channels 14 and 15, and the value for channel 7 stays constant even when I increase or decrease the input voltage. Only channels 14 and 15 change their values; I would expect channel 14 to stay constant and channel 7 to change its value.

        Temperature: 46.1576 Degree Celsius
        Vcc INT: 0.9961 V
        Vref+: 1.25 V
        Vref-: 3.00 V
        Channel 7: 2351
        Channel 14: 26032
        Channel 15: 32767

    My code looks like this:

        #include "stdio.h"
        #include "xparameters.h"
        #include "xadcps.h"

        XAdcPs XAdc;
        XAdcPs_Config* ConfigPtr;

        int main()
        {
            ConfigPtr = XAdcPs_LookupConfig(XPAR_XADC_DEVICE_ID);
            if(ConfigPtr == NULL)
            {
                xil_printf("Invalid XADC configuration!");
                return XST_FAILURE;
            }

            XAdcPs_CfgInitialize(&XAdc, ConfigPtr, ConfigPtr->BaseAddress);

            if(XAdcPs_SelfTest(&XAdc) != XST_SUCCESS)
            {
                xil_printf("Self test failed!");
                return XST_FAILURE;
            }

            XAdcPs_Reset(&XAdc);
            XAdcPs_SetSeqChEnables(&XAdc, XADCPS_SEQ_CH_AUX07 | XADCPS_SEQ_CH_AUX14 | XADCPS_SEQ_CH_AUX15);
            XAdcPs_SetSequencerMode(&XAdc, XADCPS_SEQ_MODE_CONTINPASS);

            xil_printf("Start...\n\r");

            while(1)
            {
                u32 Temp = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_TEMP);
                printf("Temperature: %.4f Degree Celsius\n\r", XAdcPs_RawToTemperature(Temp));

                u32 VCCInt = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_VCCINT);
                printf("Vcc INT: %.4f V\n\r", XAdcPs_RawToVoltage(VCCInt));

                u32 VREFp = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_VREFP);
                printf("Vref+: %.2f V\n\r", XAdcPs_RawToVoltage(VREFp));

                u32 VREFn = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_VREFN);
                printf("Vref-: %.2f V\n\r", XAdcPs_RawToVoltage(VREFn));

                u32 Ch7 = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_AUX_MIN + 7);
                xil_printf("Channel 7: %lu\n\r", Ch7);

                u32 Ch14 = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_AUX_MAX - 1);
                xil_printf("Channel 14: %lu\n\r", Ch14);

                u32 Ch15 = XAdcPs_GetAdcData(&XAdc, XADCPS_CH_AUX_MAX);
                xil_printf("Channel 15: %lu\n\r", Ch15);

                xil_printf("-------------\n\r");

                for(u32 i = 0x00; i < 0xFFFFFF; i++);
            }

            return XST_SUCCESS;
        }

    With the following XDC file:

        ##Pmod Header JA (XADC)
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux14_v_n]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux14_v_p]
        set_property PACKAGE_PIN N16 [get_ports Vaux14_v_n]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux6_v_n]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux6_v_p]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux7_v_n]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux7_v_p]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux15_v_n]
        set_property IOSTANDARD LVCMOS33 [get_ports Vaux15_v_p]

    So what is going wrong here?
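    One thing I have not tried yet (a sketch only, based on my reading of the xadcps driver; XAdcPs_SetSeqInputMode and XAdcPs_SetSeqAcqTime are my own assumptions here, and as far as I understand they have to be called while the sequencer is still in safe mode): explicitly declaring the aux pairs as differential inputs and giving them extra acquisition time before enabling the sequencer.

        /* Sketch: keep the sequencer off while configuring the channels. */
        XAdcPs_SetSequencerMode(&XAdc, XADCPS_SEQ_MODE_SAFE);

        /* Mark Vaux7/Vaux15 as differential inputs and request additional
         * acquisition (settling) time for all three enabled channels. */
        XAdcPs_SetSeqInputMode(&XAdc, XADCPS_SEQ_CH_AUX07 | XADCPS_SEQ_CH_AUX15);
        XAdcPs_SetSeqAcqTime(&XAdc, XADCPS_SEQ_CH_AUX07 | XADCPS_SEQ_CH_AUX14 | XADCPS_SEQ_CH_AUX15);

        XAdcPs_SetSeqChEnables(&XAdc, XADCPS_SEQ_CH_AUX07 | XADCPS_SEQ_CH_AUX14 | XADCPS_SEQ_CH_AUX15);
        XAdcPs_SetSequencerMode(&XAdc, XADCPS_SEQ_MODE_CONTINPASS);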
  5. Hello, I got the video example running. You can find the first version here: https://gitlab.com/Kampi/Zybo/tree/master/Examples/Video
  6. Hey, thank you for your answer. But I think taking a look at the HDMI out demo would be the better way. What do you think?
  7. Hello, I am trying to generate a VGA signal with a VDMA, a Video Timing Controller, and an AXI4-Stream to Video Out IP on my Zybo. So I created the following block design with the given settings (note: I tested the design with the Test Pattern Generator instead of the VDMA beforehand, so I know that the settings of the Timing Generator and the Video Out IP are correct). My code looks like this:

        #ifdef WITH_TESTPATTERN
        #include "xv_tpg.h"
        #endif
        #include "xaxivdma.h"
        #include "xparameters.h"

        #ifdef WITH_TESTPATTERN
        XV_tpg TPG;
        XV_tpg_Config* TPG_Config;
        #endif

        XAxiVdma_Config* VDMA_Config;
        XAxiVdma VDMA;
        XAxiVdma_DmaSetup ReadConfiguration;

        unsigned int frame_buffer[800][600][3];
        unsigned int srcBuffer;
        u32 Status;

        void fill(void)
        {
            for(u32 i = 0x00; i < 800; i++)
            {
                for(u32 j = 0x00; j < 600; j++)
                {
                    frame_buffer[i][j][0] = 0xFF;
                    frame_buffer[i][j][1] = 0xFF;
                    frame_buffer[i][j][2] = 0xFF;
                }
            }
        }

        int main()
        {
        #ifdef WITH_TESTPATTERN
            TPG_Config = XV_tpg_LookupConfig(XPAR_TESTPATTERN_DEVICE_ID);
            if(!TPG_Config)
            {
                xil_printf("Error during test pattern generator configuration!\n\r");
                return -1;
            }

            Status = XV_tpg_CfgInitialize(&TPG, TPG_Config, TPG_Config->BaseAddress);
            if(Status != XST_SUCCESS)
            {
                xil_printf("Error during test pattern generator initialization!\n\r");
                return -1;
            }

            XV_tpg_Set_height(&TPG, 600);
            XV_tpg_Set_width(&TPG, 800);
            XV_tpg_Set_bckgndId(&TPG, 0x0C);
            XV_tpg_EnableAutoRestart(&TPG);
            XV_tpg_Start(&TPG);
        #endif

            VDMA_Config = XAxiVdma_LookupConfig(XPAR_VIDEODMA_DEVICE_ID);
            if(!VDMA_Config)
            {
                xil_printf("Error during VDMA configuration!\n\r");
                return -1;
            }

            Status = XAxiVdma_CfgInitialize(&VDMA, VDMA_Config, VDMA_Config->BaseAddress);
            if(Status != XST_SUCCESS)
            {
                xil_printf("Error during VDMA initialization!\n\r");
                return -1;
            }

            ReadConfiguration.VertSizeInput = 600;
            ReadConfiguration.HoriSizeInput = 800 * (VDMA_Config->Mm2SStreamWidth >> 3);
            ReadConfiguration.Stride = 800 * (VDMA_Config->Mm2SStreamWidth >> 3);
            ReadConfiguration.FrameDelay = 0;
            ReadConfiguration.EnableCircularBuf = 1;
            ReadConfiguration.EnableSync = 0;
            ReadConfiguration.PointNum = 0;
            ReadConfiguration.EnableFrameCounter = 0;
            ReadConfiguration.FixedFrameStoreAddr = 0;

            Status = XAxiVdma_DmaConfig(&VDMA, XAXIVDMA_READ, &ReadConfiguration);
            if(Status != XST_SUCCESS)
            {
                xil_printf("Read channel configuration failed!\n\r");
                return -1;
            }

            fill();

            Status = XAxiVdma_DmaSetBufferAddr(&VDMA, XAXIVDMA_READ, (UINTPTR*)frame_buffer);
            if(Status != XST_SUCCESS)
            {
                xil_printf("Read channel set buffer address failed!\n\r");
                return -1;
            }

            Status = XAxiVdma_DmaStart(&VDMA, XAXIVDMA_READ);
            if(Status != XST_SUCCESS)
            {
                xil_printf("Failed to start DMA engine (read channel)!\n\r");
                return -1;
            }

            xil_printf("Start...\n\r");

            while(1)
            {
            }

            return 0;
        }

    But the monitor doesn't show the picture (and there is no message that the signal is missing, so HSync and VSync seem to work), and I get the terminal message "Read channel set buffer address failed!" So what is wrong with the code? Thank you, guys.
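    My current suspicion for the "Read channel set buffer address failed!" message (a sketch only, not verified yet; I am assuming XAxiVdma_DmaSetBufferAddr expects one address per frame store, and that XAXIVDMA_MAX_FRAMESTORE and MaxFrameStoreNum are the right names in the xaxivdma driver): I probably have to pass an array of frame-store addresses instead of casting the frame buffer pointer itself.

        /* Sketch: one entry per configured frame store, all pointing at the
         * same statically allocated frame_buffer for now. */
        UINTPTR FrameStoreAddr[XAXIVDMA_MAX_FRAMESTORE];

        for(u32 i = 0; i < VDMA_Config->MaxFrameStoreNum; i++)
        {
            FrameStoreAddr[i] = (UINTPTR)frame_buffer;
        }

        Status = XAxiVdma_DmaSetBufferAddr(&VDMA, XAXIVDMA_READ, FrameStoreAddr);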
  8. Sorry for the late answer. I got my ethernet running now. I think my Yocto image was the root of the problem. With the Xilinx ramdisk example the ethernet works fine, so I created a new Yocto image today and now the ethernet also works with my new SD card image. Thank you for your help. If someone else needs a working image for the Zybo, please check my Git project (https://github.com/Kampi/Zybo-Linux). There you will find all the necessary files.
  9. Hello, could this problem have something to do with missing iptables rules?
  10. Hello, yes, I think I have to test it with Wireshark (I will do it this week). The connection between the switch and the PC works, because my Raspberry Pi is connected to that switch and I can ping it (maybe I am using an uplink port of the switch? Just an idea I've got at the moment).
  11. Hello, the settings for my Zybo:

        root@linaro-developer:~# ifconfig
        eth0      Link encap:Ethernet  HWaddr 00:0a:35:00:01:22
                  inet addr:192.168.178.160  Bcast:192.168.178.255  Mask:255.255.255.0
                  UP BROADCAST MULTICAST  MTU:1500  Metric:1
                  RX packets:0 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)
                  Interrupt:146 Base address:0xb000

        lo        Link encap:Local Loopback
                  inet addr:127.0.0.1  Mask:255.0.0.0
                  inet6 addr: ::1/128 Scope:Host
                  UP LOOPBACK RUNNING  MTU:65536  Metric:1
                  RX packets:3054 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:3054 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1
                  RX bytes:290010 (283.2 KiB)  TX bytes:290010 (283.2 KiB)

        root@linaro-developer:~#

    Please see the screenshots for my settings:
  12. Hello, thank you. The ethernet interface is working now, but I can't ping the Zybo from my PC. The IP address etc. should be OK, because I can ping another device if I plug the Zybo's LAN cable into that device instead. Is there anything else that could cause problems and block my ping?
  13. Great, this solution works. Now I have an ethernet device. But why does this work?
  14. Hello, I have connected MDIO to MIO 52 and 53 now. "ip link" doesn't show any interface:

        root@linaro-developer:~# ip link show
        1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1
            link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
        2: sit0@NONE: <NOARP> mtu 1480 qdisc noop state DOWN mode DEFAULT group default qlen 1
            link/sit 0.0.0.0 brd 0.0.0.0
        root@linaro-developer:~#

    And I get these messages:

        Failed to start Raise network interfaces.
        See 'systemctl status networking.service' for details.
        [ OK ] Reached target Network.

        root@linaro-developer:~# systemctl status networking.service
        ● networking.service - Raise network interfaces
           Loaded: loaded (/lib/systemd/system/networking.service; enabled; vendor prese
          Drop-In: /run/systemd/generator/networking.service.d
                   └─50-insserv.conf-$network.conf
           Active: failed (Result: exit-code) since Sat 2016-05-21 22:31:35 UTC; 1min 28
             Docs: man:interfaces(5)
         Main PID: 1787 (code=exited, status=1/FAILURE)

        May 21 22:31:34 linaro-developer ifup[1787]: Cannot find device "eth0"
        May 21 22:31:34 linaro-developer ifup[1787]: Failed to bring up eth0.
        May 21 22:31:34 linaro-developer ntpdate[1907]: Can't find host 0.debian.pool.nt
        May 21 22:31:34 linaro-developer ntpdate[1907]: Can't find host 1.debian.pool.nt
        May 21 22:31:34 linaro-developer ntpdate[1907]: Can't find host 2.debian.pool.nt
        May 21 22:31:34 linaro-developer ntpdate[1907]: Can't find host 3.debian.pool.nt
        May 21 22:31:34 linaro-developer ntpdate[1907]: no servers can be used, exiting
        May 21 22:31:35 linaro-developer systemd[1]: Failed to start Raise network inter
        May 21 22:31:35 linaro-developer systemd[1]: networking.service: Failed with res
        lines 1-17/17 (END)

        root@linaro-developer:~# dmesg | grep ethernet
        [ 0.808309] macb e000b000.ethernet: failed to get macb_clk (4294967294)
        [ 0.813579] macb: probe of e000b000.ethernet failed with error -2
  15. Hello, thank you for your answer. Please take a look at "Ethernet.png" for my ethernet settings; I use the same ones as in this thread. But I don't have any ethernet interface when I use the "ip link show" command after booting Linux:

        root@linaro-developer:~# ip link show
        1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1
            link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
        2: sit0@NONE: <NOARP> mtu 1480 qdisc noop state DOWN mode DEFAULT group default qlen 1
            link/sit 0.0.0.0 brd 0.0.0.0
        root@linaro-developer:~#

    Should I make the MDIO port external?
  16. Hello, the ethernet on my Zybo doesn't work with Linux. I use this Linaro as my rootfs. Additionally, I followed these instructions to create my design with all necessary files. My Linux boots up, but I get the message "no phy found". I've updated my device tree with this:

    ZyboEthernet.dtsi

        &gem0 {
            #address-cells = <1>;
            #size-cells = <0>;
            clock-names = "ref_clk", "aper_clk";
            clocks = <&clkc 13>, <&clkc 30>;
            compatible = "cdns,zynq-gem", "cdns,gem";
            interrupt-parent = <&intc>;
            interrupts = <0 22 4>;
            local-mac-address = [00 0a 35 00 00 00];
            phy-handle = <&phy0>;
            phy-mode = "rgmii-id";
            status = "okay";
            reg = <0xe000b000 0x1000>;
            xlnx,eth-mode = <0x1>;
            xlnx,has-mdio = <0x1>;
            xlnx,ptp-enet-clock = <111111115>;

            mdio {
                #address-cells = <1>;
                #size-cells = <0>;

                phy0: phy@1 {
                    compatible = "realtek,RTL8211E";
                    device_type = "ethernet-phy";
                    reg = <1>;
                };
            };
        };

    Bootargs.dts

        /include/ "system-top.dts"
        /include/ "ZyboEthernet.dtsi"

        / {
            chosen {
                bootargs = "console=ttyPS0,115200n8 consoleblank=0 root=/dev/mmcblk0p2 rw rootwait earlyprintk";
                linux,stdout-path = "&uart1";
            };
        };

    How can I get my ethernet running? Thank you for your help!
  17. Hello, thanks for your reply. I don't have self-powered speakers, so I have to buy some :(... I think I could use . The PMOD-AMP3 looks like a good alternative, but it has I²C for configuration, and I want to test I²S on its own, without I²C configuration. Otherwise I could use the audio codec on my ZYBO (which will be the next step once I²S is running).
  18. Hello, I received my I2S Audio PMOD (http://store.digilentinc.com/pmodi2s-stereo-audio-output/) today and I want to use it with my ZYBO. My I2S transmitter looks like this:

        library ieee;
        use ieee.std_logic_1164.all;

        entity I2S_Transmitter is
            generic (
                DATA_WIDTH : integer := 32
            );
            port (
                Clock   : in  STD_LOGIC;
                MCLK    : out STD_LOGIC;
                Data_In : in  STD_LOGIC_VECTOR(DATA_WIDTH - 1 downto 0);
                LRCLK   : out STD_LOGIC;
                DOUT    : out STD_LOGIC;
                Reset   : in  STD_LOGIC;
                Empty   : out STD_LOGIC
            );
        end entity;

        architecture I2S_Transmitter_Arch of I2S_Transmitter is
            signal InputBuffer  : std_logic_vector(DATA_WIDTH - 1 downto 0) := (others => '0');
            signal Empty_Signal : std_logic := '1';
            signal DOUT_D1      : std_logic := '0';
            signal WS           : std_logic := '0';
            signal Counter      : integer := 0;
        begin
            LRCLK_Logic: process(Clock)
            begin
                if(falling_edge(Clock)) then
                    if(Reset = '1') then
                        Counter <= 0;
                        WS <= '1';
                    else
                        Counter <= Counter + 1;

                        if(Counter > (DATA_WIDTH / 2 - 1)) then
                            WS <= '1';
                        else
                            WS <= '0';
                        end if;

                        if(Counter = (DATA_WIDTH - 1)) then
                            Counter <= 0;
                        end if;
                    end if;
                end if;
            end process;

            Load_and_shift_out_Data: process(Clock)
            begin
                if(falling_edge(Clock)) then
                    if(Reset = '1') then
                        InputBuffer <= (others => '0');
                        Empty_Signal <= '1';
                    else
                        if(Empty_Signal = '1') then
                            InputBuffer <= Data_In;
                            Empty_Signal <= '0';
                        else
                            InputBuffer <= InputBuffer((DATA_WIDTH - 2) downto 0) & '0';
                            DOUT_D1 <= InputBuffer(DATA_WIDTH - 1);
                        end if;

                        if(Counter = (DATA_WIDTH - 1)) then
                            Empty_Signal <= '1';
                        end if;
                    end if;
                end if;
            end process;

            MCLK <= Clock;
            LRCLK <= WS;
            DOUT <= DOUT_D1;
            Empty <= Empty_Signal;
        end architecture I2S_Transmitter_Arch;

    With this testbench:

        library IEEE;
        use IEEE.STD_LOGIC_1164.ALL;

        entity I2S_Transmitter_TB is
        --  Port ( );
        end I2S_Transmitter_TB;

        architecture I2S_Transmitter_TB_Arch of I2S_Transmitter_TB is
            signal MCLK    : std_logic;
            signal WS      : std_logic := '0';
            signal SCK     : std_logic := '1';
            signal DOUT    : std_logic;
            signal Empty   : std_logic := '0';
            signal Reset   : std_logic := '0';
            signal Data_In : std_logic_vector(31 downto 0) := (others => '0');

            -- 8.192 MHz clock
            constant clk_period : Time := 122 ns;
            constant ws_period  : Time := 16 * clk_period;
        begin
            DUT: entity work.i2s_transmitter
                port map (
                    LRCLK   => WS,
                    MCLK    => MCLK,
                    Data_In => Data_In,
                    Clock   => SCK,
                    DOUT    => DOUT,
                    Empty   => Empty,
                    Reset   => Reset
                );

            CLOCK: process
            begin
                wait for clk_period / 2;
                SCK <= not SCK;
            end process;

            STIMULUS: process
            begin
                Data_In <= x"507B507B";
                Reset <= '1';
                wait for 10 us;
                Reset <= '0';
                wait for 10 us;
                Data_In <= (others => '0');
                wait for 30 ms;
                Reset <= '1';
                wait for 5 ms;
            end process;
        end I2S_Transmitter_TB_Arch;

    I use port JC on my ZYBO, and this is my XDC file:

        # Clock
        set_property PACKAGE_PIN L16 [get_ports Clock_In]
        set_property IOSTANDARD LVCMOS33 [get_ports Clock_In]

        set_property PACKAGE_PIN V15 [get_ports MCLK]
        set_property IOSTANDARD LVCMOS33 [get_ports MCLK]
        set_property PACKAGE_PIN W15 [get_ports LRCLK]
        set_property IOSTANDARD LVCMOS33 [get_ports LRCLK]
        set_property PACKAGE_PIN T10 [get_ports DOUT]
        set_property IOSTANDARD LVCMOS33 [get_ports DOUT]
        set_property IOSTANDARD LVCMOS33 [get_ports DEM]
        set_property PACKAGE_PIN T11 [get_ports DEM]

    The clock frequency for MCLK is 8.192 MHz, generated with an MMCM. I've connected this speaker (http://store.digilentinc.com/speaker/) to the PMOD, but I don't hear any sound. Why? Thank you for your help!